Big Data in action: smart cities meets energy

In Austria and the US, two very different big data projects are underway, both with the same goal: optimising the way people consume energy by analysing the data generated by smart meters.

More efficient use of energy and water, fewer traffic jams, better public safety: these are just a few of the promises that the ‘smart city’ concept makes. It’s vital to human health and security that these goals are achieved: according to figures from the UN, more than two-thirds (70 percent) of the world’s population will live in urban areas by 2050, stretching resources, space and, possibly, residents’ patience to their limits.

But in order to be smarter, a city needs data - a lot of it.

Sensors and meters embedded in utilities networks, in particular, are already capable of delivering data in huge volumes, but if city authorities and utilities firms are to be able to understand energy usage patterns and potential supply issues, they’ll need a hefty dose of big data technologies to collect, manage and analyse that information.

This idea is central to a giant construction project currently underway on an abandoned airfield on the north-eastern outskirts of Vienna. It’s the site of a new smart city, Aspern, and by next year, if work goes to plan, there will be around 3,240 apartments there. By 2028, Aspern will be home to around 8,500 apartments, along with shops, schools, health centres, offices and a subway station that will transport passengers to or from central Vienna in just 25 minutes.

It’s also the site of one of Europe’s most ambitious smart energy projects - a “living laboratory” in the words of its creators - where researchers hope to establish how renewable power sources, smart buildings and smart-grid technologies might best be combined to power a thriving, eco-friendly community.

That research will be led by Monika Sturm, head of R&D at Aspern Smart City Research (ASCR), a €40 million joint venture formed in 2013 between the City of Vienna, utility provider Wien Energie and industrial giant Siemens.

The plan, she told attendees at data warehousing company Teradata’s recent customer conference in Prague, is to kit out individual buildings at Aspern with different combinations of smart-energy technologies and analyse the results using a range of big data technologies: traditional data warehouses, MPP [massively parallel processing] appliances, and open-source data analysis framework Hadoop. As we’ll see in the next article [xxxx], the big-data trend is, in some cases, setting these approaches in direct competition with each other - but many organisations continue to embrace whatever tools they need to get analytics work done.

“By analysing the most efficient mixes of technologies and their influence on end-user behaviour, we expect data analytics will lead to new paths for energy optimisation in smart cities everywhere, for the benefit of all.”
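To make that mix-versus-mix comparison concrete, here is a minimal sketch in Python of the kind of grouping-and-averaging analysis ASCR describes. The readings, building IDs and technology labels are invented for illustration; the article says nothing about ASCR’s actual schema or tooling.

```python
from statistics import mean

# Hypothetical interval readings: (building_id, technology_mix, kWh consumed).
# Real data at Aspern would arrive from smart meters into a warehouse, an
# MPP appliance or Hadoop; these rows are invented for illustration.
readings = [
    ("A1", "solar+battery", 11.2),
    ("A1", "solar+battery", 10.8),
    ("B2", "heat-pump", 14.5),
    ("B2", "heat-pump", 13.9),
    ("C3", "solar+heat-pump", 9.7),
    ("C3", "solar+heat-pump", 9.4),
]

# Group consumption by technology mix, then compare the averages.
by_mix = {}
for _, mix, kwh in readings:
    by_mix.setdefault(mix, []).append(kwh)

for mix, values in sorted(by_mix.items(), key=lambda kv: mean(kv[1])):
    print(f"{mix}: {mean(values):.1f} kWh average per interval")
```

In practice the interesting step is the join against end-user behaviour data, but the core question - which combination of technologies yields the lowest consumption - reduces to aggregations of this shape, whichever platform runs them.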

In California, meanwhile, utility company PG&E is somewhat further down the line: it’s already the largest US utility to have installed smart meters right across its service territory, which covers 70,000 square miles and 9.4 million residential and commercial properties. Now, the focus is firmly on extracting real business value from that roll-out effort, says Jim Meadows, PG&E’s director of smart grid technologies.

The smart meters that PG&E has installed measure energy use in hourly or quarter-hourly increments, allowing customers to track energy usage throughout the billing month and giving them greater control over the costs of heating, cooling and lighting their homes. They also give PG&E more visibility into its own operations, as well as lowering the costs associated with meter reading and management.
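The arithmetic behind that customer-facing tracking is simple aggregation. As a minimal sketch - the quarter-hourly cadence matches the article, but the readings and the flat tariff are invented for illustration:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical quarter-hourly smart-meter readings: (timestamp, kWh used).
# The 15-minute cadence matches the article; the values are invented.
readings = [
    (datetime(2014, 6, 1, 0, 0), 0.12),
    (datetime(2014, 6, 1, 0, 15), 0.10),
    (datetime(2014, 6, 1, 0, 30), 0.11),
    (datetime(2014, 6, 2, 9, 0), 0.45),
    (datetime(2014, 6, 2, 9, 15), 0.52),
]

FLAT_RATE_PER_KWH = 0.15  # hypothetical tariff, in dollars

# Roll quarter-hourly intervals up into daily totals - the view a customer
# tracking usage through the billing month would see.
daily = defaultdict(float)
for ts, kwh in readings:
    daily[ts.date()] += kwh

for day in sorted(daily):
    print(f"{day}: {daily[day]:.2f} kWh, est. ${daily[day] * FLAT_RATE_PER_KWH:.2f}")

month_kwh = sum(daily.values())
print(f"Month to date: {month_kwh:.2f} kWh, est. ${month_kwh * FLAT_RATE_PER_KWH:.2f}")
```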

But these 9.4 million meters generate a mountain of data - around 2 terabytes per month, or 100 billion readings per year. This is collected by the company and stored for analysis in PG&E’s Interval Data Analytics (IDA) platform, based on a data warehouse from Teradata. Analytics tools from SAS Institute and Tableau Software, meanwhile, are used to interrogate the data.
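Those headline figures are easy to sanity-check with back-of-envelope arithmetic. A rough calculation, assuming hourly readings from every meter (the article doesn’t break down cadences or storage formats):

```python
# Back-of-envelope check on the stated figures: 9.4 million meters,
# ~100 billion readings per year, ~2 TB per month. The hourly cadence
# assumed below is a simplification, not a PG&E specific.
METERS = 9_400_000
HOURS_PER_YEAR = 365 * 24

readings_per_year = METERS * HOURS_PER_YEAR
print(f"Hourly readings/year: {readings_per_year / 1e9:.0f} billion")
# ~82 billion - the same order as the stated 100 billion; meters
# reporting quarter-hourly would push the total higher.

TB = 1e12
bytes_per_reading = (2 * TB * 12) / readings_per_year
print(f"Implied bytes per stored reading: {bytes_per_reading:.0f}")
# A few hundred bytes each once timestamps, identifiers and indexing
# overhead are counted - a plausible storage figure.
```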

“We’re doing our best to focus the company, and all of its different lines of business, on a single data platform for this interval data,” says Meadows. “We made a conscious decision early on to build a platform where data could be cleansed and perfected in a single place and made ready for presentation to business users in a wide range of different ways.”
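What “cleansed and perfected in a single place” might involve is standard interval-data hygiene: deduplication, range checks, and routing suspect readings to review rather than silently dropping them. A minimal sketch, with invented field names and thresholds - the article doesn’t describe PG&E’s actual rules:

```python
# A minimal sketch of interval-data cleansing of the sort done once,
# centrally, before data reaches business users. The field names, the
# plausibility cap and the rules are assumptions for illustration.
MAX_PLAUSIBLE_KWH = 50.0  # hypothetical cap for one residential interval

def cleanse(readings):
    """Deduplicate, then separate clean from implausible readings.

    `readings` is an iterable of (meter_id, timestamp, kwh) tuples.
    Returns (clean, flagged) lists.
    """
    seen = set()
    clean, flagged = [], []
    for meter_id, ts, kwh in readings:
        key = (meter_id, ts)
        if key in seen:  # duplicate transmission - keep the first copy
            continue
        seen.add(key)
        if kwh < 0 or kwh > MAX_PLAUSIBLE_KWH:
            flagged.append((meter_id, ts, kwh))  # route to review
        else:
            clean.append((meter_id, ts, kwh))
    return clean, flagged

clean, flagged = cleanse([
    ("m1", "2014-06-01T00:00", 0.12),
    ("m1", "2014-06-01T00:00", 0.12),  # duplicate
    ("m2", "2014-06-01T00:00", -3.0),  # implausible
])
print(len(clean), "clean,", len(flagged), "flagged")
```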

Beyond Aspern and PG&E, countless other public servants and utilities executives are planning big data projects of their own - and a huge cast of hardware, software and services vendors will be more than happy to assist them.

The main drivers of smart-grid analysis investments, according to a recent report from research company GTM Research, will be to improve asset management for grid components, bring more granularity to demand-side management, speed up outage response times - and achieve a better return on investment for smart meters. GTM Research expects cumulative spending to total around $20.6 billion between 2012 and 2020, with an annual spend of $3.8 billion worldwide in 2020.
