Jermaine Campbell, Head of Sales for the Surveillance segment at Seagate Technology EMEA, discusses the benefits of agile and effective data storage solutions.
Today’s cities are living entities. They develop, grow and become more complex over time, yet many of their most pressing yet mundane issues, such as the need for utility improvements and crime monitoring, remain the same. Like never before, city officials have the capability to implement analytics technology to help with a myriad of everyday city demands, in addition to more intricate challenges pertaining to security, healthcare, mobility, energy and economic development.
With more than half of the world’s population residing in cities, this need for smarter and more accurate insights into their everyday workings is monumental. City management officials could learn much from leaders like Cisco, Amazon and Google, which have made it their business to not just collect data, but constantly utilise it to improve livelihoods and communities. As we look to their successes, it becomes increasingly evident that the answer to creating smarter cities lies largely in surveillance technology that captures data analytics.
With the rise of surveillance technology and predictive analytics, we can make Smart Cities smarter and, in effect, more efficient. At the same time, it’s crucial to understand that connectivity is never a guarantee: when real-time decisions are required, the necessary data must be present regardless of connectedness. As a result, sufficient local storage must exist to position the most valuable data nearest to the point of compute. This speaks to the increasing importance of the edge, as well as embedded storage.
The growth of real-time data through edge analytics is causing a shift in the type of digital storage cities need, with fast, uncompromised access to data becoming ever more critical.
According to a recent IDC study, sponsored by Seagate, Data Age 2025: The Digitization of the World from Edge to Core, the Global Datasphere (a measure of how much new data is created and replicated each year) will grow more than fivefold over the next seven years. The total amount of new data created in 2025 is forecast to increase to 175ZB from 33ZB in 2018. The major drivers of this growth are largely consistent across the world’s various regions but occur at different rates. Entertainment data and video surveillance footage have long been (and continue to be) significant drivers of the Global Datasphere. However, signals from Internet of Things (IoT) devices, metadata (vital for analytics, contextualisation and artificial intelligence [AI]) and productivity data are showing even faster growth in today’s increasingly digitised world.
With all this available data there has never been a greater volume of insights at our fingertips and cities must step up to develop ways to use this data for good. In many ways, cities are already doing this, from intelligent street lights optimising routes based on traffic patterns to reduce emergency response time by 20 to 30%, to advanced surveillance cameras with analytics deployed to enhance security operations, leading to a reduction in crime by 30 to 40%. However, we can do so much more.
To function as a true Smart City today, a city will need an “edge tier” approach to store, filter and manage data closer to the sensors. To gain deeper insights, the data is stored and analysed for longer periods of time in the edge domain as well as in the cloud or backend. A city will only have the power to act in real-time to find missing persons, notify residents of nearby emergencies and send out traffic congestion warnings through edge analytics closer to where data is captured and collected on network video recorders (NVRs).
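A simple way to picture the edge-tier idea is as a retention policy: recent recordings stay on the NVR for real-time analytics, while older ones migrate to the cloud back-end for longer-term analysis. The sketch below is illustrative only; the function names and the seven-day cut-off are assumptions, not a description of any specific Seagate or NVR product.

```python
# Illustrative edge-tier retention policy: recent footage lives on the
# edge NVR, older footage moves to the cloud back-end. The 7-day window
# is a hypothetical policy choice.
from datetime import datetime, timedelta

EDGE_RETENTION = timedelta(days=7)

def tier_for(recorded_at: datetime, now: datetime) -> str:
    """Decide where a recording should live based on its age."""
    return "edge-nvr" if now - recorded_at <= EDGE_RETENTION else "cloud-backend"

now = datetime(2020, 7, 15)
print(tier_for(datetime(2020, 7, 14), now))  # recent recording stays at the edge
print(tier_for(datetime(2020, 6, 1), now))   # older recording ages out to the cloud
```

In practice the cut-off would be tuned to the NVR’s capacity and the city’s analytics needs, but the principle is the same: keep the data that drives real-time action closest to the point of compute.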
The opportunities data analysis and data-driven urban improvement present are both hugely exciting and impossible to ignore. Behavioural analytics, thermal cameras and AI engines in edge devices like NVRs are just a sampling of the technologies that have given us the ability to remain constantly connected on a vast network. By horizontally interrelating individual systems, we can now develop insights into various mechanisms, including patterns in electricity, water, sanitation, transportation, environmental monitoring and weather intelligence.
Take, for instance, West Hollywood’s Innovation Division, which recently received the American Planning Association (APA) Technology Division’s Smart Cities Award for the “WeHo Smart City” Strategic Plan. Its three-part plan consisted of: data-driven decision-making rolled out to departments citywide; collaboration and experimentation designed to enable City Hall staff to work better together; and automation of processes to improve public safety and manage the built environment through Smart City sensors and smart building programs.
With data collected from predictive analytics based on Deep Learning activities in the back-end, in some cases for over a year, we can pre-identify trends to manage incidents in one sector that directly impact another. For example, if every time it rains, a river floods and causes traffic congestion to back up on the freeway, a city can now use surveillance technology to anticipate these weather patterns before they take place and direct traffic to alternative routes before the waters rise.
Cities need data in the moment and on the go, placing a larger demand on the edge to produce the predictive and reliable information required, often in real-time. In fact, reports predict that due to the infusion of data into our city workflows and personal streams of life, nearly 30% of the “Global Datasphere” will be in real-time by 2025.
That’s a lot of real-time data. So, how can a city implement surveillance technology to better secure itself and enable smarter analyses? The first step is identifying the right storage solutions, positioned at the centre of a Smart City’s surveillance application, that enable recordings, data retention, predictive analytics and real-time alerts. The next step is ensuring that this data is positioned at the edge, providing ample time for cities to analyse and make sense of patterns to then inform actions.
However, that’s not the end of the story. Edge computing is a new and complex way of organising IT infrastructure and there are several important issues around data management that need to be addressed before enterprises can enjoy the positive dividends of this new era of distributed networks.
Exponential data, exponential complexity
According to the Gartner report “How to Overcome Four Major Challenges in Edge Computing”: “By 2022, more than half of enterprise-generated data will be created and processed outside the data centre or cloud; however, this data is different.” This introduces additional layers of complexity: every business will have a different set of needs and priorities for their data and there is no single industry standard for managing IoT ecosystems from edge to core to help.
Many organisations have moved at least some of their data to the public cloud, encouraged by convenience and rapid scalability. However, as the sheer quantity of data produced accelerates, this can lead to several issues around security, complexity and financial cost. Private cloud and hybrid cloud solutions solve this issue, but introduce their own problems, making managing data from edge to core more difficult. As we look to the future, we’ll see novel ways of addressing this issue, blending hardware and software to remove unnecessary steps and make it easier to move data to where it needs to be, faster and more reliably.
Guarding against attack
It’s simple really: the bigger your network, the more opportunities you open up for cyberattack. For any business, the benefits of edge computing must be weighed against security risks and action must be taken to mitigate those risks. While data sitting in a public or private cloud is probably quite secure, data at the edge is much less so by default.
Luckily, there are options suitable for most enterprises looking to enhance the security of their data. Hardware has a key role to play: it doesn’t matter how secure your software is if the data can be extracted by someone in possession of the physical media it is stored on. Data-at-rest encryption should be standard for all data across the network, with keys to unlock it stored separately from the data that is protected. With this level of encryption, even if the network itself is compromised, the underlying data remains safe. Over the coming years, hardware protection will increasingly be a critical component of wider enterprise data security strategy and the businesses that look at this seriously now will be in a solid position for the future.
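The principle described above, that encrypted data on the media is useless without a key held elsewhere, can be shown with a small sketch. This is a toy construction for illustration only (real data-at-rest encryption uses hardware or standardised ciphers such as AES, not this home-made keystream), but it makes the key-separation idea concrete.

```python
# Toy illustration (NOT production cryptography) of data-at-rest
# encryption with the key stored separately from the data: the bytes
# that land on the drive are unreadable without the key.
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256-derived keystream; the same call encrypts and decrypts."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

key = secrets.token_bytes(32)    # held in a key-management service, never on the same media
nonce = secrets.token_bytes(16)  # stored alongside the ciphertext; not secret
footage = b"frame-0001: camera 12"
on_disk = keystream_xor(key, nonce, footage)   # what actually lands on the drive

# Someone who extracts the physical media gets only `on_disk`;
# only a holder of `key` can recover the original data.
print(keystream_xor(key, nonce, on_disk) == footage)
```

The design choice that matters here is not the cipher but the separation: even if the network or the media is compromised, the underlying data remains safe as long as the keys live elsewhere.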
Understanding the value of your data
As more data is stored at the edge, it will become more difficult for enterprises to get a complete view of their data and understand its value. For example, a manufacturer may have several different embedded devices in a smart factory, each producing different types of data. This data could be put to use, helping to identify efficiencies in the production process or flagging faults in real time. The challenge is in preventing that data from becoming siloed.
There is additional complexity too, in that different types of data have different value to the business. As the Gartner report “How to Overcome Four Major Challenges in Edge Computing” notes: “On average, a byte of data at the edge will have a lower value than a typical byte of data in today’s data centres. This is because a great deal of data is noise, requiring pre-filtering or basic analysis.” Against this backdrop, it is clear that more needs to be done to help businesses analyse this data at the edge and move it safely and securely around the network to where it is most needed.
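The pre-filtering Gartner mentions can be pictured as a simple gate at the edge: low-value noise is counted and discarded locally, and only events above some value threshold are forwarded to the core. The stream of camera motion scores and the 0.6 cut-off below are hypothetical, chosen purely to illustrate the pattern.

```python
# Sketch of edge pre-filtering on a hypothetical stream of (timestamp,
# motion-score) readings: noise stays at the edge, events go to the core.

MOTION_THRESHOLD = 0.6  # illustrative cut-off separating noise from events

def prefilter(readings: list) -> tuple:
    """Split readings into events worth forwarding and a count of discarded noise."""
    events = [(ts, score) for ts, score in readings if score >= MOTION_THRESHOLD]
    dropped = len(readings) - len(events)
    return events, dropped

readings = [("09:00", 0.1), ("09:01", 0.8), ("09:02", 0.05), ("09:03", 0.9)]
events, dropped = prefilter(readings)
print(events)   # forwarded to the core for deeper analysis
print(dropped)  # noise filtered out at the edge
```

Even this crude gate halves the traffic in the example above, which is exactly why a byte at the edge is, on average, worth less than a byte in the data centre: much of it never needs to travel.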
As enterprise networks become increasingly dispersed and complex, it will become more challenging to understand what data is important. Storing and managing data at the edge is going to become an important way for enterprises to manage their data growth now and in the coming years, but that comes with its own set of challenges. Now is the critical time: many businesses are at the start or early stages of this journey and thinking about data management, security and value at this stage will make the process of managing data from edge to core far easier in the long run.
On the same note, more than ever before, cities will need to come together to integrate their technologies and ultimately make their networks smarter using all available data in real time, a challenge that will require broad cooperation across their systems. Smart, agile and effective storage solutions are the foundation of this strategy, ensuring timely data access and availability from edge to cloud.
To learn more about Seagate Technology surveillance storage solutions, visit: https://www.seagate.com/internal-hard-drives/surveillance/
This article was published in the July 2020 edition of International Security Journal.