large sensor networks and smart cities.
What we do
We provide software solutions for smart grid development and help utilities extract value from their data through intelligent analytics.
A highly adaptable MDC/MDM solution covering interoperable, massively parallel device communication and configurable data processing, including accurate estimation of missing values, complex validations, detection and classification of behaviour patterns, anomaly highlighting and forecasting.
Powered by stream processing (Kafka) and a distributed database (Elasticsearch).
Suitable for advanced pilot projects where flexibility and creativity are needed, such as voltage sensing, load control, and balancing flexibility between production and consumption.
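As a minimal sketch of what estimating missing values can look like in its simplest form, the hypothetical routine below fills gaps in a series of meter readings by linear interpolation between the nearest valid neighbours. The function name and data layout are illustrative assumptions, not the product's actual API.

```python
def estimate_missing(readings):
    """Replace None entries with linearly interpolated values.

    readings: list of floats, with None marking missing measurements.
    Leading/trailing gaps are left unfilled (no neighbour to anchor on).
    """
    filled = list(readings)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            start = i - 1                       # last valid index (may be -1)
            j = i
            while j < len(filled) and filled[j] is None:
                j += 1                          # find where the gap ends
            if start >= 0 and j < len(filled):  # both neighbours exist
                lo, hi = filled[start], filled[j]
                span = j - start
                for k in range(i, j):
                    filled[k] = lo + (hi - lo) * (k - start) / span
            i = j
        else:
            i += 1
    return filled

print(estimate_missing([10.0, None, None, 16.0]))  # [10.0, 12.0, 14.0, 16.0]
```

Real deployments typically combine several estimators (e.g. profile-based for long gaps, interpolation for short ones); this shows only the simplest case.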
Secondary substation balance predictions, detection and prioritization of phase asymmetry, estimation of the order in which phases are connected at individual connection points, detection and prioritization of voltage problems, load balancing of substations with local PV production, and consumption disaggregation to detect potentially flexible loads such as air conditioning, electrical heating, hot-water boilers and EV charging.
Experience processing more than 150 million AMM measurements.
Ultra-fast predictors that run notably faster than neural networks, allowing thousands of automatically adapting predictors to operate in parallel.
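One well-known family of lightweight predictors cheap enough to run per meter or per feeder in their thousands is exponential smoothing. The sketch below is an illustrative example of that general idea, not the actual proprietary predictor described above.

```python
class ExpSmoother:
    """Single-parameter exponential smoothing.

    Each update is O(1) in time and memory, which is what makes it
    feasible to run one instance per meter or feeder in parallel.
    """

    def __init__(self, alpha=0.3):
        self.alpha = alpha   # smoothing factor in (0, 1]
        self.level = None    # current smoothed level

    def update(self, x):
        """Ingest one measurement; return the one-step-ahead forecast."""
        if self.level is None:
            self.level = x                       # initialise on first sample
        else:
            self.level += self.alpha * (x - self.level)
        return self.level

s = ExpSmoother(alpha=0.5)
print(s.update(10.0))  # 10.0
print(s.update(20.0))  # 15.0
```

A higher `alpha` adapts faster to new behaviour; a lower one smooths out noise. Production systems would typically also handle seasonality, but the per-sample cost stays far below a neural network's.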
Smart grids, smart cities and IoT have a lot in common: many sensors and a lot of data. This data can be processed and provided as a service to third-party applications or directly to users.
Our data processing platform can integrate various data sources and discover hidden connections and correlations.
The platform provides precise, complex forecasts, time and space aggregations, trend detection, segmentation and anomaly detection.
Smart meters and sensors provide a continuous stream of millions of measurements and events. The sole function of a one-star system is to cleanse, validate and store. A one-star system works very much like a data warehouse, allowing data to be stored and queried. This is a necessary function, although insufficient to make the most effective use of the data measured. Querying data that grows by hundreds of millions of measurements daily is no easy task for users either.
Filtering is the simplest way of processing data. It creates an event when a measurement (e.g. voltage) is higher or lower than a defined threshold. A two-star system can both filter and alert. Would users, however, want to be confronted with thousands of individual alerts each hour?
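Threshold filtering of this kind can be sketched in a few lines. The limits and record format below are illustrative assumptions (230 V ±10 %), not the system's actual configuration.

```python
# Illustrative two-star filtering: compare each voltage sample against
# fixed limits and emit an alert event when it falls outside them.
V_MIN, V_MAX = 207.0, 253.0  # hypothetical limits: 230 V nominal, ±10 %

def filter_events(samples):
    """samples: iterable of (meter_id, voltage) pairs; yields alert events."""
    for meter_id, voltage in samples:
        if voltage < V_MIN:
            yield {"meter": meter_id, "type": "undervoltage", "value": voltage}
        elif voltage > V_MAX:
            yield {"meter": meter_id, "type": "overvoltage", "value": voltage}

alerts = list(filter_events([("m1", 200.0), ("m2", 230.0), ("m3", 260.0)]))
print(alerts)  # two events: one undervoltage (m1), one overvoltage (m3)
```

At grid scale this is exactly the problem the text raises: a rule this simple can fire thousands of times per hour, which is why the higher star ratings add clustering and context.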
A system deserves a three-star rating when it is able to detect clusters of related events, analyze data in relation to grid topology, and detect situational and behaviour patterns. For example, it can inform users that a specific substation feeder has been experiencing repeated voltage problems since last spring, occurring whenever a specific photovoltaic source is generating a lot of power.
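One simple way to detect clusters of related events, assuming each event carries a feeder id and a timestamp, is to group events by feeder and split each group wherever the gap between consecutive events exceeds a time window. This is an illustrative sketch of the idea, not the system's actual clustering algorithm.

```python
from collections import defaultdict

def cluster_events(events, window=3600):
    """Group alert events into clusters per feeder.

    events: list of (feeder_id, timestamp_seconds) pairs.
    A new cluster starts whenever the gap between consecutive events
    on the same feeder exceeds `window` seconds.
    Returns a list of (feeder_id, [timestamps]) clusters.
    """
    by_feeder = defaultdict(list)
    for feeder, ts in events:
        by_feeder[feeder].append(ts)

    clusters = []
    for feeder, times in by_feeder.items():
        times.sort()
        current = [times[0]]
        for ts in times[1:]:
            if ts - current[-1] > window:
                clusters.append((feeder, current))  # close this cluster
                current = []
            current.append(ts)
        clusters.append((feeder, current))
    return clusters

# Two events close together, one much later -> two clusters on feeder "f1"
print(cluster_events([("f1", 0), ("f1", 100), ("f1", 10000)]))
```

A recurring cluster on the same feeder (e.g. every sunny afternoon) is the kind of signal that, combined with topology, becomes the three-star insight described above.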
A four-star data processing system will warn you when an incident is about to happen. Being able to predict consumption, production and reaction to changing levels of demand is vital for efficient and effective grid management. Accurate information of this kind is key to switching from a reactive to a proactive mode of control.
Grid behaviour is continuously evolving and changing. New behaviour patterns occur, priorities evolve and new threats arise. Fixed data processing rules quickly become outdated. Adaptive algorithms are necessary to detect and understand all changes and anomalies.
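A minimal sketch of such an adaptive rule, assuming a single numeric signal per meter, is an exponentially weighted detector: it flags samples that deviate from a running mean by more than a few standard deviations, while the statistics themselves keep adapting so that slow drift is absorbed rather than flagged. The parameters below are illustrative, not the system's actual ones.

```python
class AdaptiveDetector:
    """Flag samples deviating from an adaptive mean by > k std deviations.

    Both mean and variance are exponentially weighted, so gradual changes
    in behaviour are absorbed into the baseline, while sudden jumps are
    reported as anomalies.
    """

    def __init__(self, alpha=0.1, k=4.0):
        self.alpha, self.k = alpha, k
        self.mean, self.var = None, 0.0

    def observe(self, x):
        """Ingest one sample; return True if it looks anomalous."""
        if self.mean is None:
            self.mean = x            # first sample defines the baseline
            return False
        diff = x - self.mean
        anomalous = self.var > 0 and abs(diff) > self.k * self.var ** 0.5
        # Update statistics even on anomalies, so the model keeps adapting.
        self.mean += self.alpha * diff
        self.var = (1 - self.alpha) * (self.var + self.alpha * diff * diff)
        return anomalous

d = AdaptiveDetector()
print([d.observe(x) for x in [10, 11, 10, 11, 10]])  # all False (normal noise)
print(d.observe(100))                                # True (sudden jump)
```

Unlike a fixed threshold, this rule needs no manual retuning when a feeder's normal load profile shifts; the baseline follows the data.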