by John Morrell, Sr. Director of Product Marketing, Datameer
Big data is really “the little engine that could.” While this may seem like an oxymoron, allow me to explain. When broken down, big data is just a collection of seemingly infinite small details and information points sitting in different silos. But all of these data points are part of a much greater puzzle that, once “solved” and articulated, has the power to fuel digital transformation.
But to get to this point, organizations must transform their data, which often comes from a variety of disparate sources and silos, into reusable, consumable and executable data assets.
So, how best to go about fueling your big data analytics program to derive the most value? Read on for five ways to boost your big data efforts.
1. Get Agile!
Traditional analytics drew heavily on the involvement of IT teams whenever changes needed to be made to the ETL and static data warehouses, ultimately resulting in a complex and convoluted reporting system. With this layout, new forms of analytic results could often take months to obtain.
But now, with tools like Datameer and Hadoop offering a schema-less architecture for analytics, business teams are able to work on raw data, curating and engineering it for quick consumption. All of this occurs within a simplified metaphor that relies on easily understood spreadsheets and visuals.
An added benefit of this visual approach is its tendency to reduce data production process inefficiencies. Since the data is delivered in an easily consumable visual format, the number of processing iterations that occur between IT and business teams is lessened because data preparation and refinement are so much more intuitive.
The results of this schema-less approach? Reduced time to data delivery, often going from months down to days and, in some cases, as little as a few hours. Not to mention the reusability of the components and models created during the production process.
2. Deliver More Data
Finding ways to take all of your raw, siloed data and turn it into something meaningful should be a critical component of your big data strategy. But why stop at simply transforming your data into useful information? There are ways to not only extract insights from the raw data, but also to speed up this process, giving the organization faster and more efficient access to actionable data.
One key way to achieve this goal is to create a “cooperative curation process” between key members of your data analytics team. Such a process creates a “Data Village” of sorts, wherein data engineers, business analysts, CDOs and business owners are brought together within a single toolset that allows them to synchronize the data curation process and then subsequently execute on the data using their favorite BI tools.
Accomplishing this level of collaboration across a variety of roles requires a tool set with a strong visual component that provides the ability to graphically see the shape and aspects of the data in a free-form manner without dimensions or restrictions. The tool must also be capable of performing its analytical function at scale – combing through billions of records and thousands of attributes to drill down and across to the most important information hidden within massive quantities of data.
Datameer Visual Explorer
Datameer Visual Explorer provides the responsive architecture capable of this type of large-scale data exploration, with a backend that delivers sub-second response times. Its schema-less architecture also allows you to perform free-form exploration directly on your data lake because it doesn’t rely on fixed, pre-determined models/paths. This means there’s no need to migrate your data into your network or systems in order to interactively explore it in any direction you choose.
Datameer Visual Explorer also allows you to quickly pre-aggregate your data on the fly, dramatically speeding up the exploration process. And, because there is no pre-computing of indexes, extra storage requirements are reduced, making exploration on your data lake far more resource efficient. The benefits of being able to perform data curation, preparation and exploration all in one integrated stack? Faster analytic cycles and complete control of governance, all in one place, on one product.
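The on-the-fly pre-aggregation idea can be sketched in plain Python. This is a conceptual illustration only, not Datameer's actual engine: the record layout and the `pre_aggregate` helper are hypothetical, standing in for rolling billions of raw rows up into a small summary keyed on the dimensions being explored, so drill-downs scan the summary rather than the raw data.

```python
from collections import defaultdict

# Hypothetical raw event records, standing in for rows in a data lake.
raw_events = [
    {"region": "EMEA", "channel": "web",    "revenue": 120.0},
    {"region": "EMEA", "channel": "mobile", "revenue": 80.0},
    {"region": "AMER", "channel": "web",    "revenue": 200.0},
    {"region": "AMER", "channel": "web",    "revenue": 50.0},
]

def pre_aggregate(events, dimensions, measure):
    """Collapse raw events into a small summary keyed on the chosen
    dimensions, so interactive exploration hits the compact summary
    instead of re-scanning the full raw dataset."""
    summary = defaultdict(float)
    for event in events:
        key = tuple(event[d] for d in dimensions)
        summary[key] += event[measure]
    return dict(summary)

# Aggregates are built on the fly for whichever drill path is explored.
by_region = pre_aggregate(raw_events, ["region"], "revenue")
by_region_channel = pre_aggregate(raw_events, ["region", "channel"], "revenue")
```

Because the summaries are computed on demand rather than pre-indexed, nothing extra has to be stored ahead of time, which mirrors the resource-efficiency point above.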
3. Power New BI
The BI and big data worlds today are still in need of more interconnection. While many organizations work with big data, they tend to keep these activities distinct and separate from their BI tools. But there are many similarities and potential synergies that could exist if we bring BI teams and Big Data teams closer together – to use a term from earlier: we must create the “Data Village.”
In order to do this, we must first ask the key question: What am I trying to achieve by integrating BI with Big Data? If the answer is that you want to engage in new age, Big Data-powered BI, then you must involve the role of the power analyst. These are the individuals who will explore big, new questions revolving around digital transformation – How do I get to know my customer better? How do I deal with omni-channel customer engagement? How do I drive better customer acquisition and retention processes?
Answering these questions requires free-form data exploration that doesn’t limit the investigation process with pre-determined schemas and methods. Herein lies the “sweet spot” for forward-looking organizations determined to drive new value and action from unrestricted access to vast quantities of data. Big Data-powered BI truly enables digital transformation because it emphasizes agility and the blending of new datasets to drive action. This stands in stark contrast to more traditional approaches that simply take big data and run it through existing BI processes.
Unfortunately, many organizations are attempting to recreate their EDW stack on top of Hadoop in order to achieve new BI insights. The result is often the same type of data latency, inefficient data movement, and disjointed governance and security infrastructure issues that existed in the EDW world.
The solution? Use your data lake as the BI accelerator. Put all of your data into the data lake, facilitate and curate your data assets, and bring your business analysts directly to the lake to perform free-form data exploration that will go a long way towards answering new questions and driving business agility.
4. Use the Cloud
There are several reasons why businesses are deciding to take their analytics to the cloud. Chief among these is a desire for scalability; however, greater flexibility, lower costs, faster response to business needs and reduced IT involvement also make the list.
But, for many organizations looking at a move to the cloud, there are a couple of factors that bear consideration. First, businesses recognize that, while attractive, the cloud isn’t their only way of performing analytics. The cloud must marry with the initiatives they have in play on-premise to create a hybrid infrastructure that facilitates both.
Secondly, as always, security is a top priority. When working with big data in the cloud, businesses must always be certain they are utilizing tools and platforms that give them the same level of security in the cloud that they can achieve on-premise.
So with these two factors in mind, what should businesses be looking to get out of the cloud when it comes to their big data?
- Increased Business Agility: The ability to spin up resources in the cloud as needed to allow business teams to run an analysis, crunch data and work with the data on an ad-hoc basis
- Follow Data Gravity: The ability to land data in the cloud when it’s most convenient to reduce unnecessary data movement
- Elasticity: The flexibility to scale to accommodate varying workloads in an on-demand fashion
- Remove IT Barriers: The ability to engage resources in the cloud without having to wait for involvement from IT
To ensure the above benefits of working with your big data in the cloud, it is essential to choose a solution with a hybrid, cloud-first architecture that separates compute from storage to guarantee the level of elasticity needed to scale with your workloads.
5. Deliver “Big” AI
Operationalizing AI in today’s business world can be tricky because the process to do so is highly customized in nature. This process, often called re-implementation, involves large amounts of costly custom coding to generate business-ready AI frameworks that are often difficult to maintain and integrate with other systems.
To solve this dilemma, organizations must utilize a tool that marries the building of data pipelines with AI insights. Datameer has done this by creating its SmartAI feature, which integrates with AI and machine learning frameworks (like Google’s TensorFlow) to allow the data preparation, feature engineering and blending that optimizes the data for the AI framework to work with.
Once prepared, the data can be run through training processes and used to create a model that represents a business problem that needs to be addressed. Once created, the model can then be re-ingested by Datameer and, with a single button, deployed to operationalize the data pipeline and start enriching data insights.
The benefits of utilizing AI in this manner are faster times to deep learning insights, the ability to deploy models directly on the data lake and the relief of avoiding maintenance issues involved with custom coding. Not to mention the fact that the entire process remains secure and governed within the organization.
The Long and Short of It
Big data enables businesses to transform themselves to take advantage of the digital economy in ways that empower them to take action on the insights that their data reveals.
The key to deriving valuable business outcomes from big data is to focus on removing the barriers that exist between the people, tools and methods involved in the various stages of the analytics journey.
By capitalizing on existing and emerging technologies that enable more inclusive and seamless access to big data, organizations will continue to build their “Big Data Villages” and power their business decisions with increasingly sophisticated BI insights.