BDA CONFEX

Big Data Analytics

5 November 2020

Business Design Centre, London


Programme

Session one – enterprise strategy and roadmap for success with intelligence-driven design

  • Data governance strategies: competitive differentiation
  • Predict and optimise your business outcomes: data modelling
  • Data foundation building: business relevant analytics
  • Machine learning in production: from pilot to production
  • Deploying smart machines: developing your AI platform
  • Building trust in your data visualisation tools
  • Data visualisation: what story are you trying to tell?
09:00
Conference Chair’s Opening Address
09:15
Data governance strategies: competitive differentiation

Our opening address outlines an exemplar data governance programme, with a focus on the framework and methodology that need to be baked into, and scaled across, the enterprise.

  • A formal approach to valuing the cost of poor-quality data
  • Exposure of blind spots through the identification of vital data governance controls
  • How to prepare yourself for governing innovation, disruption and fluid strategies
09:35
Predict and optimise your business outcomes: data modelling

Data is essential, but true optimisation comes from improving performance and competitive advantage through analytics models that democratise access to the insights gained.

Many believe that the best approach to producing an effective model starts with identifying a business opportunity, which allows you to determine how the model can improve performance, rather than starting with the data.

We address how this approach to modelling can generate faster outcomes, embed models in practical data relationships that are more accessible, and avoid the risk of designing overly complex models which exhaust organisational capabilities.

09:50
Data foundation building: business relevant analytics

As with other key data analytics investments, businesses direct significant spending towards their IT reporting and analytics capabilities. Such efforts, however, sometimes lead to an unnecessarily diverse range of solutions from a variety of vendors which fail to align in practice. As a result, the flow of data within an organisation is not properly utilised and its potential remains unknown.

One of the ways in which businesses are addressing this challenge is by moving to the cloud as this provides the opportunity to simplify data logistics architecture, implement data governance and modernise digital infrastructure.

We address:

  • Simplifying the flow of data and technical architecture
  • Taking control of your data usage
  • Producing your own agile delivery method and operationalisation of results
  • Assessing what cultural shift and skill set upgrade you need
10:05
Machine learning in production: from pilot to production

Production systems require machine learning models at a greater rate than ever. This has led to an emphasis on building and training models rather than on how such models should be deployed and used by applications, as well as on the infrastructure required to support operations.

We will highlight the processes and techniques needed to support integration from the pilot to the production stage, and how you can best deploy machine learning models into production systems in a comprehensive and manageable way.

10:20
Deploying smart machines: developing your AI platform

By deploying a business-relevant AI platform, you will have at your disposal a framework capable of accelerating AI projects through their life cycle at scale. This will provide you with an opportunity to work in a structured and flexible way and produce a platform upon which to create AI-powered solutions. It will also allow you to deliver AI services for identified business needs as you navigate from proof of concept to production-scale systems.

We address how the right AI platform can:

  • Facilitate faster, more efficient collaboration
  • Better develop the data scientist-ML engineer relationship
  • Limit costs by avoiding duplication
  • Automate low value tasks
  • Improve reproducibility and reusability
10:35
Questions To The Panel Of Speakers
10:50
Refreshment Break Served in the Exhibition Area
11:30
Building trust in your data visualisation tools

A lot of investment is directed towards data visualisation tools because of the important role they play in the overall data process.

Producing a viable big data visualisation which is both informative and accessible requires scrutiny of the data, an understanding of what it is you wish to convey, and an appreciation of the insights gained. Ultimately, the success or failure of a data visualisation project starts with the decisions you make at the initial design stage.

We address the common pitfalls of data visualisation projects and how best to avoid them.

11:45
Data visualisation: what story are you trying to tell?

Data visualisations are rightly regarded as one of the most effective ways to turn the insights gained from masses of data into stories which can be interpreted by all business units, regardless of discipline.

That being said, data scientists are still required to manage two seemingly disparate forces: complexity and simplicity. Having the ability to create easily digestible graphs and visuals, whilst ensuring a level of complexity which reflects today’s data-driven demands, is a must.

We address:

  • Adopting a story-first approach
  • Building your data around the story you wish to tell
  • Achieving a simplicity-complexity balance
  • Securing cross-departmental buy-in and feedback
12:00
Questions to the Panel of Speakers and Delegates move to the Seminar Rooms
12:15
Seminar Sessions

(To view topics see the seminars page)

13:00
Networking Lunch Served in the Exhibition Area

Session two – measuring the viability of your data, understanding your business needs and knowing how to be successful

  • Harnessing dark data: creating new value
  • Edge computing in analytics: improve response times
  • Augmented analytics: where to implement
  • IoT and digital twins: life cycle management of assets and processes
  • Deep Learning Algorithms: putting AI to work
  • From bricks to clicks: conversational AI for the modern retailer
14:00
Conference Chair’s Afternoon Address
14:05
Harnessing dark data: creating new value

Given that roughly 90 percent of data generated by sensors and analog-to-digital conversions never gets used, it is clear that there is a vast amount of enterprise data which is not being utilised.

Many businesses believe they will utilise dark data once they have acquired better analytic and BI technology, which highlights a lack of confidence in current capabilities. By not acting to utilise dark data at speed, organisations are causing 60 percent of their data to lose value immediately.

In this session, we throw light on the concept of dark data, its challenges and explain how companies can gain insights at speed to drive new innovation.

14:20
Edge computing in analytics: improve response times

For businesses with a global reach, transferring and translating data from a wide array of geographies takes too long, is expensive and opens organisations up to privacy concerns. Rather than channelling data to a central system to carry out performance analytics, it is better to distribute and federate at the edge.

By combining edge computing with big data analytics, you will increase the speed and ease at which you can analyse vast amounts of geographically distributed data. Intelligent edge software provides near real-time insight from data whether it is seconds or years old.

14:35
Augmented analytics: where to implement

Augmented analytics complements traditional BI tools and provides us with a new paradigm in which AI and machine learning successfully augment human-led analysis of data.

As a positive disrupter, augmented analytics has been shown to transform how content is produced, ingested and shared.

We will show how successful adoption can advance the means by which organisations prepare, manage, analyse, process, mine and platform data.

14:50
Questions to the Panel of Speakers
15:00
Afternoon Networking and Refreshments served in the Exhibition Area
15:30
IoT and digital twins: life cycle management of assets and processes

Digital twins have become a must-have business tool thanks to cost-effective advancements in IoT enterprise usage. Their primary value lies in their ability to allow data analysts to investigate the life cycle of an asset or process in a more tailored way.

By bridging the gap between the physical and the digital through the use of smart components, enterprises can gather massive amounts of data centred on performance testing and innovative new products. It is for this reason that digital twins have come to be regarded as the most advanced type of analytic and predictive technology available today.

We address:

  • Using digital twins to analyse, plan and prevent
  • Prevention of downtime
  • Lower cost of production
  • The cross-industry sector value of adoption
15:45
Deep Learning Algorithms: putting AI to work

In an era of unprecedented data accumulation, many can be forgiven for thinking that insights can only be gained from massive data sets. The reality is that this is not always the case, especially with AI data-driven platforms.

By analysing process-specific data sets you can better assess the quality of the data and learn more. You can also better maintain control of the data as your model gains value, and reconfigure your workflows when necessary, as AI provides real-time feedback by embedding itself into your relevant business processes.

We explore the value of deep learning algorithms and how best to deploy them in your organisation, whether process-specific or company-wide.

16:00
From bricks to clicks: conversational AI for the modern retailer

In a time of great uncertainty for large-scale retailers, many within the sector are asking how they can stay relevant as customers move from the high street to online. One of the ways to maintain a dialogue with your customer base and provide a more engaging platform is through the adoption of conversational AI.

Big data and conversational AI provide the opportunity for old, established brands to compete with new, disruptive forces who are completely data driven.

We address:

  • How conversational AI helps improve customer service
  • Automation of staff tasks following customer queries
  • Augmenting the processes and workflows based upon queries
  • Orchestrating the activities which come from augmentation
  • Achieving competitive differentiation by measuring customer value with accuracy
16:15
Questions to the Panel of Speakers
16:35
Closing Remarks from the Conference Chair
16:45
Conference Closes, Delegates Depart

Please note:
Whitehall Media reserve the right to change the programme without prior notice.