The technologies that will empower the smart cities of the future are refreshingly free of controversy. The concept itself is an easy sell: after all, who doesn’t want buildings to be greener, energy networks more efficient, traffic congestion and pollution reduced?
So, while smart cities will require billions of pounds of investment in new technologies over the coming decades, there is an almost unanimous consensus among politicians, businesses and citizens about the benefits they will bring, says Jackson Lee, vice president of corporate development at Colt Data Centre Services.
But while money, political will and public appetite are no barriers to the development of smarter, healthier cities, there is one fundamental issue that threatens to delay our progress towards this future utopia – IT infrastructures that were designed for a “dumber”, less data-intensive age.
Data demands of the smart city
Cities are built by people, but not necessarily for them. Business, industry and profit were the main forces that drove the organic, sprawling growth of the world’s great metropolises – not their citizens’ health, happiness and convenience.
Smart technologies, including the internet of things (IoT), promise to solve many of the eternal problems that city-dwellers have been forced to put up with, from toxic air and gridlocked streets to public safety. New services will transform the way that municipalities manage public transport, shared infrastructure, city planning, waste and recycling, lighting and smart grids, while improving access to healthcare, education and local government services.
But these new services all depend on the ability to generate, process and analyse previously inconceivable volumes of data. From sensors measuring air quality around the city, to the thousands of CCTV cameras monitoring transportation systems; from smart energy grids to “intelligent” bins that tell refuse collectors when they need emptying – the smart city will be a network of computers that constantly creates huge amounts of information.
This information is the raw intelligence that goes into making the key decisions on which smart services depend. Everything from transportation systems to smart grids and industry-related applications depends on instant, high-speed, ultra-reliable connectivity between the device collecting or generating the data and the systems that process and analyse it. This, however, is where traditional technology infrastructure threatens to delay – or even prevent – us from achieving the full benefits promised by smart cities.
Living on the Edge
The traditional cloud model made a great deal of sense. Concentrating storage and processing power in hyperscale facilities enables users to take advantage of huge economies of scale to manage intense, high-volume workloads at a manageable cost.
Unfortunately, this model works far less well for the demands of modern smart city technologies. When storage, compute and analytics are located at a centralised hub – which may not even be in the same country, let alone the same city, as the place where the data is generated – a significant lag is inevitably introduced.
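To see why distance alone matters, consider a back-of-the-envelope estimate of round-trip propagation delay over optical fibre. The distances below are illustrative assumptions, not figures from the article: a hyperscale facility in another country versus a hypothetical edge data centre within the city itself.

```python
# Back-of-the-envelope comparison of best-case round-trip network latency
# for a sensor reading processed at a distant hyperscale cloud facility
# versus a local edge facility. Distances and the fibre propagation
# factor are illustrative assumptions.

SPEED_OF_LIGHT_KM_S = 300_000          # speed of light in a vacuum
FIBRE_FACTOR = 0.67                    # light travels at roughly 2/3 c in fibre

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay in milliseconds."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR)
    return 2 * one_way_s * 1000

# Hypothetical deployments:
cloud_ms = round_trip_ms(1_500)   # hyperscale facility in another country
edge_ms = round_trip_ms(15)       # edge data centre within the city

print(f"cloud: {cloud_ms:.2f} ms, edge: {edge_ms:.2f} ms")
```

Even this idealised figure ignores routing hops, queuing and processing time, all of which add further delay on top of raw propagation – yet the distant facility is already two orders of magnitude slower than the edge site.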
This time delay might only be measured in milliseconds, but it can still have significant knock-on effects on services and applications that rely on instant communications, such […]
Read more here: www.m2mnow.biz/feed/
Posted on: January 12, 2018