Big Data Market Segment LS
Tuesday, 21 April 2020 15:18

Edging towards intelligence everywhere

By Bob Yang, Seagate Technology
Bob Yang, Regional VP Sales APAC, Seagate Technology

VENDOR CONTRIBUTION by Bob Yang, Regional VP Sales APAC, Seagate Technology: In the data age, nanoseconds matter. But at the same time, the sheer volume of data generated by Internet of Things sensors means centralisation is no longer the answer. Hence the rise of edge computing, with tiered architectures powering artificial intelligence.

There’s a bigger picture to go along with the talk around the Internet of Things and artificial intelligence. As technology advances to the point where autonomous vehicles, sentient software and robotics help “intelligence everywhere” become reality, the structure of it all must change. Pushing computing horsepower to the edge is now necessary, and as it rolls out, old hands may just get a sense of déjà vu.

Computing horsepower at the edge is becoming big business. In an October 2018 report, McKinsey & Company identified 107 distinct edge use cases, estimating the potential value of edge computing at US$175–215 billion by 2025 – and that's just the value to hardware companies. A 2018 IDC study sponsored by Seagate forecast that by 2025, 30% of the world's data will need processing at the edge. Yet another report, ‘Edge Internet Economy: The Multi-Trillion Dollar Ecosystem Opportunity’, puts the value of the edge at US$4.1 trillion by 2030.

The big question is ‘why?’ Why is the edge emerging as a new and rapidly growing frontier in computing, in an environment where the cloud is all but de facto for so many use cases?

It’s perhaps worth taking a quick and massively simplified walk down memory lane. Industry veterans will recognise IT 1.0 as mainframes and terminals; not unlike cloud computing, except distances to terminals were a lot shorter. IT 2.0 was driven by the rise of PCs and distributed computing, with (relatively) powerful endpoints in client-server models, and content consumption. As components shrank, IT 3.0 emerged on the back of mobile devices and the rise of the cloud.

Right now, IT 4.0 is well underway, characterised by the talk, and in many cases reality, of artificial intelligence, robotics, industrial IoT, autonomous vehicles and large-scale industry transformation.

At each point along the way, data generation grew exponentially. Hard drive capacity grew from single-digit megabytes to 16 million megabytes – that is, 16 terabytes. The amount of data generated in 2016 alone is estimated at 16 zettabytes; one zettabyte is a million petabytes. By 2025, IDC estimates the world will produce 175 zettabytes.
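The growth figures above can be sanity-checked with a little arithmetic. A quick sketch in Python, taking the top end of drive capacity as 16 million megabytes (16 terabytes) and using decimal units as storage vendors do:

```python
# Sanity-checking the article's storage and data-volume figures.
# The constants are the article's own numbers, in decimal units.

MB = 10**6   # bytes in a megabyte (decimal, as drive makers count)
TB = 10**12
PB = 10**15
ZB = 10**21

# Drive capacity growth: single-digit megabytes to 16 TB drives.
growth_factor = (16 * TB) // (1 * MB)
print(f"capacity growth factor: {growth_factor:,}x")  # 16,000,000x

# One zettabyte is a million petabytes.
print(ZB // PB)  # 1000000

# IDC's 2025 forecast: 175 ZB total, 30% needing edge processing.
edge_share_zb = 175 * 0.30
print(f"edge share by 2025: {edge_share_zb:.1f} ZB")  # 52.5 ZB
```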

Not for nothing, then, that market watcher IDC calls this ‘the data age’¹.

Sensors…and making sense of data

A proliferation of devices is behind the creation of all that data. Much of it comes from sensors – but, as you’ll appreciate, simple sensors generate relatively little data. Temperature, wind speed, location – that’s the small stuff.

Cameras, on the other hand, generate enormous files of ‘unstructured’ data, and organisations will increasingly depend on AI to make sense of it. The depth of that data also represents enormous value, going well beyond the metrics pumped out by a simple sensor – enabling, for example, the operation of an autonomous car.

In fact, the autonomous car demonstrates most readily why powerful computing at the edge is a necessary characteristic of IT 4.0. Because on the road, nanoseconds matter and latency of milliseconds or microseconds can have disastrous effects.

That means the vast amounts of data an autonomous vehicle generates must be analysed, contextualised, understood and acted on quickly. Some of that happens in the vehicle, to be sure. Some of it happens elsewhere – but that elsewhere can’t be the cloud. Not only can the data not travel there and back again fast enough, but the pipes just aren’t big enough.

Multiply that single vehicle example by a hundred, a thousand, a million high-volume data use cases and the scale of the problem is apparent.
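The latency argument can be made concrete with back-of-the-envelope numbers. The sketch below is illustrative only: the distances, the five-millisecond processing time and the roughly 200 km-per-millisecond figure for light in optical fibre are assumptions, not measurements of any real deployment.

```python
# Back-of-the-envelope round-trip latency: why a distant cloud region
# is too far away for a vehicle's time-critical decisions.

C_FIBRE_KM_PER_MS = 200  # light in optical fibre covers ~200 km per ms

def round_trip_ms(distance_km: float, processing_ms: float) -> float:
    """Round-trip transit time plus remote processing time."""
    return 2 * distance_km / C_FIBRE_KM_PER_MS + processing_ms

# A distant cloud region versus a nearby edge site (assumed distances).
cloud_rtt = round_trip_ms(distance_km=1500, processing_ms=5)
edge_rtt = round_trip_ms(distance_km=10, processing_ms=5)
print(f"cloud: {cloud_rtt:.2f} ms")  # 20.00 ms
print(f"edge:  {edge_rtt:.2f} ms")   # 5.10 ms

# A car at 100 km/h covers ~2.8 cm per millisecond, so every
# millisecond of latency is distance travelled before reacting.
metres_per_ms = 100 * 1000 / 3600 / 1000
print(f"{cloud_rtt * metres_per_ms:.2f} m travelled waiting on the cloud")
```

Even with optimistic assumptions, the transit time alone to a remote region dwarfs what a nearby edge site would add.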

Endpoint, edge and core

This simple example shows why IT 4.0 must have three distinct architectural tiers: the endpoint (in this case, the car – but it could be a drone, a phone or an industrial IoT device); the edge (a cellular tower, an office building or other server-equipped facility); and the core (the cloud and traditional data centres).

It’s a tiered approach; when convenient and when capacity allows, the data feeds back to the core for further analysis where it will ultimately enter a standard data management process. The tiered approach also puts the lie to any contentions that ‘The edge will eat the cloud’, as provocatively suggested by Gartner VP and analyst Thomas J. Bittman. Co-existing, the cloud and the edge each complement and mutually enhance the value of the other in a classic ‘the whole is more than the sum of the parts’ way. As always in the IT industry, it is about use-cases and appropriate solutions.
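As a rough illustration of that tiered approach, the toy routine below routes work to endpoint, edge or core based on latency budget and data size. The thresholds, tier rules and job names are hypothetical, invented for this sketch rather than drawn from any vendor's architecture.

```python
# A minimal sketch of tiered endpoint/edge/core placement.
# Thresholds are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Workload:
    source: str
    size_bytes: int
    latency_budget_ms: float  # how quickly a decision is needed

def choose_tier(w: Workload) -> str:
    """Decide where a piece of data should be processed."""
    if w.latency_budget_ms < 10:
        return "endpoint"  # safety-critical: act inside the device
    if w.size_bytes > 10**6 or w.latency_budget_ms < 100:
        return "edge"      # heavy or time-sensitive: nearby facility
    return "core"          # everything else feeds back to the cloud

jobs = [
    Workload("collision-avoidance", 4_000, 2.0),
    Workload("camera-frame-analysis", 8_000_000, 50.0),
    Workload("fleet-trend-report", 200_000, 60_000.0),
]
for job in jobs:
    print(job.source, "->", choose_tier(job))
# collision-avoidance -> endpoint
# camera-frame-analysis -> edge
# fleet-trend-report -> core
```

The point of the sketch is the co-existence the article describes: the core is not bypassed, it simply receives the work that can wait.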

It’s a point made by the author of the Edge Internet Economy report, Chetan Sharma: “It comes down to understanding which apps and services will benefit from edge architecture. For example, if you are in manufacturing or construction industry and you manually assemble machinery, you can benefit from automation and deploy the edge to speed things up. Whatever your business, you need to ensure the infrastructure is ready.”

What this means in practice is that with the edge/cloud structure, intelligent applications like the autonomous car can function effectively in the field (everywhere), with crucial data processing taking place on the edge, at the last mile.

The rest happens in the cloud, when convenient.

¹ Data Age 2025 – The Digitization of the World, IDC






