Businesses generate troves of data, and hidden within that data is information that can improve performance, boost efficiency, and deliver better customer experiences. However, extracting that value first requires mastering the tools that help make sense of it.
AWS commissioned Deloitte Access Economics to research regional organisations' data capability and maturity, released today as the Demystifying Data 2021 report. A key finding is that 62% of Australian and New Zealand organisations report a basic or beginner level of data maturity and, further, that only one-third of businesses expect to move up the ladder within the next five years.
Yet enterprises of all sizes have much to gain by unlocking the value of their data. AWS has been running a two- to five-day intensive program out of six locations (Seattle, New York, Herndon, London, Bangalore, São Paulo) which brings together AWS experts and a company's internal builders and subject matter experts to solve complex data challenges using AWS services. As of today, there is a seventh location, namely Australia and New Zealand. The lab comes at no cost, yet provides real value both in terms of the learnings and access to AWS experts, as well as the construction and delivery of a real application over the course of the program.
For those not ready to build, but who would still benefit from an AWS architect's recommendation, there is a cut-down half-day to two-day engagement known as the Design Lab.
The AWS Data Lab is run locally by a team reporting to Vicky Falconer, AWS Data Lab Manager, and, depending on the customer's specific scenario, brings in local AWS data engineers with deep technical skills and experience in analytics, databases, and machine learning.
The idea is to create a team focused entirely on helping customers to build real solutions addressing real needs by thinking big, starting small, and scaling fast.
“Customers want to know how to effectively use data analytics, but they often struggle with how to get started,” Falconer said. “We always start with a business problem, and then we work backwards from there. We take a big problem, and we scope it to something we can address in a few days. This is an effective way to validate whether a great idea will solve a problem.”
iTWire experienced the Data Lab for itself, with this writer participating in the AWS Data Lab pilot and thus able to share genuine experiences. In our situation, we applied for the program through our AWS account manager, which required explaining the real-world business problem we wished to solve. While the AWS Data Lab is free, there is a finite number of AWS engineers who can participate. However, our business case was approved, and as the date for our Data Lab drew nearer we had fortnightly, then weekly, pre-lab cadence calls with Falconer and her team.
Here, we worked through the problem we had, identifying the sources of our data, determining who would be the right people in our business to attend and what their skills and limitations were, and taking crash courses in specific AWS technologies like RDS, DMS, Glue, and others. An AWS architect proposed their solution which then formed the basis for what we would actively work on during the lab.
The week of the lab came around, and four people from our organisation, combined with several experts from AWS locally and from Virginia in the US, spent four days building a real, working pipeline of data extraction, transformation, and loading, switching between DMS, S3, Glue, Lambda, IAM, Redshift, QuickSight, and other products as needed. We certainly hit challenges along the way which required some solid thought and investigation to debug, but we genuinely ended the four-day lab with knowledge and with a working prototype in our dev AWS environment.
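To give a flavour of the extract-transform-load pattern we built, the toy sketch below mimics the flow in plain Python, with simple functions standing in for the AWS services (DMS for extraction, Glue for transformation, Redshift for loading). It is illustrative only; all function names and the sample data are hypothetical, and the real pipeline ran on the managed AWS services themselves.

```python
# Illustrative stand-in for a DMS -> S3 -> Glue -> Redshift pipeline.
# Plain Python replaces the managed services; names are hypothetical.

def extract(rows):
    """Stand-in for DMS pulling rows out of a transactional source."""
    return [dict(r) for r in rows]

def transform(rows):
    """Stand-in for a Glue job: normalise fields, drop incomplete rows."""
    return [
        {"id": r["id"], "amount": round(float(r["amount"]), 2)}
        for r in rows
        if r.get("id") is not None and r.get("amount") is not None
    ]

def load(rows, warehouse):
    """Stand-in for loading cleaned rows into Redshift for reporting."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
source = [{"id": 1, "amount": "19.99"}, {"id": None, "amount": "5"}]
loaded = load(transform(extract(source)), warehouse)
print(loaded)        # 1 row survives the transform
print(warehouse[0])  # {'id': 1, 'amount': 19.99}
```

The value of the lab was less in any one of these steps than in learning which AWS service fits each stage, and how they hand data to one another.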
We have since expanded on those learnings, and while we changed direction somewhat, based on what we now knew about the AWS tools, we have successfully implemented the aggregation of data, both streaming and batched, from a variety of disparate data sources into a data lake for fast reporting independent of the transactional systems the data came from.
For me, the AWS Data Lab was a tremendous experience and one I'd recommend to any organisation that knows it can do great things with its data but doesn't know the right tools to use, or even what the appropriate solution would look like.
“Customers learn best by building and getting hands-on experience,” Falconer said. “We are there to guide them, but our intent is to take customers on a journey. We want them to walk out of the lab with a solution and the skills to take it to the next level.”
Locally, other AWS customers trialling the Data Lab pilot include Intellihub and TEG. Globally, Data Lab customers include Nasdaq and Dow Jones, along with hundreds of other customers creating innovative technologies across healthcare, education, media, entertainment, and non-profits.
“Every customer is different, so this isn’t a cookie-cutter approach where we have one solution that we use over and over again,” said Falconer. “Diving deep into customers’ problems makes the program unique.”
More information is available here.