Identity-management market is reaching a crossroads

Many experts claim that the identity-management market is reaching a crossroads. Whilst the complexity of managing users across organisations and the security requirements to preserve privacy and prevent identity theft are increasing, IT budgets for the best technology are falling because of the continuing recession. However, although we may now be firmly in the midst of a double-dip recession, the IAM market is proving to be fairly robust and resilient. IAM is a critical component of any enterprise’s security strategy, and consequently receives higher prioritization than other enterprise technologies. Many of the major companies have indicated that they are prepared to spend as much as 8 percent of their overall security budgets on IAM. It’s estimated that the IAM market will continue to grow and could be worth as much as £10 billion by the end of 2013. The reasons for this are obvious: IAM products will continue to attract attention and investment because they are part of a critical technology that lets businesses improve and automate access management processes. Continue reading…

This year’s Big Data Analytics Conference proves to be a resounding success

Whitehall Media’s Big Data Analytics Conference 2012 proved to be not just a great experience, but a resounding success. The feedback received on the day from the 530 delegates and major industry sponsors attending the event was compelling and wholly positive: the organisation, the structure, the venue, the staging, the thought-provoking content and the exhibition were all singled out for universal praise. The event proved so successful that Whitehall Media has now been asked to schedule another Big Data conference before the end of the year. Whitehall Media Group would like to thank all those who attended this stimulating and informative conference, and invite you all to our next Big Data conference, which will be held at the Victoria Park Plaza, London, on 6 December 2012. Continue reading…

Whitehall Media: Public Sector enterprise ICT and government ICT strategy, 2012

Information and communications technology (ICT) is critical for the effective operation of government and the delivery of the services it provides to citizens and businesses. The government’s ICT Strategy aims to deliver better public services at a reduced cost. It believes efficient ICT can release savings, and increase public sector productivity and efficiency. These savings are critical in order to reduce the structural deficit and continue to fund front-line services. The government maintains that its ICT strategy will enable the building of a common infrastructure, underpinned by a set of common standards. Government will work to accelerate the implementation of the strategy as part of its drive to cut costs and improve current capabilities.

Continue reading…

The Cabinet Office announces a major milestone in the government’s ICT Strategy

The government’s Information and Communication Technology (ICT) Strategy set out the way in which the government ICT landscape would change over the current spending review period. It included 30 points of action which it believed would lay the foundations for achieving the Strategy’s core objectives: reducing waste and project failure, stimulating economic growth, creating a common ICT infrastructure, using ICT to enable and deliver change, and strengthening governance. The government maintained that all these changes were necessary if the UK’s ICT structure was to be cost-effective and fit for purpose in the twenty-first century.

Continue reading…

Whitehall Media: Public Sector enterprise ICT and government ICT strategy, 2012

IT operational efficiency is a business imperative. Optimizing the IT infrastructure improves operational efficiency by reducing costs, improving agility, and maximizing performance. The transition to the Public Services Network, which forms an integral part of the coalition government’s ICT strategy, is already well underway. The government has calculated that by 2014, Britain’s public sector could be saving £130 million or more each year through both reform and efficiency savings. Although only one year into this programme of reform, the implementation of the strategy has already resulted in the awarding of the first public service connectivity framework agreements to 12 enterprises; offered more opportunities for SMEs, particularly on the G-Cloud framework; produced savings of £159.6 million on ICT contracts during financial year 2011-12; brought about dramatic improvements in the speed and ease of ICT procurement; reduced the size and complexity of ICT projects; and succeeded in making Government ICT more open and accountable to the people and organisations that use public services. Continue reading…

Coalition government launches second G-Cloud supplier framework to create more opportunity for SMEs

The government has announced the introduction of a second initiative based around the SME-friendly G-Cloud framework. The initiative will give more companies an opportunity to supply G-Cloud services through the CloudStore online catalogue. Speaking at a recent press conference, Cabinet Office minister Francis Maude maintained that the government was able to make this announcement because it had already made ‘significant progress’ in implementing its ICT strategy. The second phase of the G-Cloud framework will be launched because of the overwhelming response generated by the first tender. Over 600 expressions of interest were received: the Government Procurement Service subsequently awarded framework agreements to around 250 suppliers, of which around three-quarters are SMEs. Continue reading…

Industry experts claim the UK needs a more robust and effective home-centred security strategy to counter Tier One risks like cyber attack

The UK government’s National Security Strategy sets out the key strategic choices that have to be addressed to ensure the UK’s security and resilience against acts of terrorism and hostile acts in UK cyberspace. The government’s ‘Strategic Defence and Security Review’ outlined its priorities in responding to threats against our national security, and the increasing threats to our CNI. The reports set out the 15 priority risk types that the government had identified, including four critical areas which were identified as the most important threats to national security over the next five years. These particular threats, or Tier One risks (international terrorism, attacks on UK cyberspace, national military crises, and a major accident or natural hazard such as a pandemic), were identified as priority areas of concern which warranted additional funding. Continue reading…

Nigel Sanctuary’s Big Data Blog 2, VP Cloud Propositions, Kognitio

Big Data, Cloud, Big Data, Cloud, Big Data, Cloud.  There, that’s the daily research reading done for analytical platforms and database technology. Now feel free to get on with your day.

What I mean is, is there anything new to say in this space? Well, perhaps. Can we talk about adoption? At Kognitio, we want to monitor who is addressing these issues and starting to build Big Data environments in the cloud. It’s hard to find out on the web, and I am certainly not going to breach any NDAs by telling you who from our prospect and customer base is looking at this. However, I think there are some views about adoption we can share.

In the technical space, there are two fears that are delaying wholesale adoption, according to Information Week and their November 2011 survey “The state of database technology”: virtualization of servers for scaling, and how to manage security. For many, virtualization means a slowdown of performance in the database, and with Big Data rapidly becoming a reality, that is a real blocker. Security has so many different aspects that it is hard to know which one to address first. “All of them”, I suppose, is the simple answer.

Now it strikes me as obvious that if Big Data is the subject of the moment and cloud-based technology means virtualization, you have a clash of cultures for a large portion of database IT leaders. How are we going to sort that one out?

Amazingly, it’s actually quite simple. You find a database technology that embeds a function to virtualize hardware as a single machine and runs the database processing in parallel on this virtual environment. All you need to do is find a cloud-based provider who can offer this. Thankfully, Massively Parallel Processing on reserved instances of cloud-based computers is a reality; we have just such a service here at Kognitio. What is rarer is one that enables the performance to be maintained with the virtualisation in operation. Most virtualized, cloud-based services don’t parallelize the database across the virtual machine environment, making a single instance of the database platform from multiple servers. And only one of them does that in-memory…in the cloud (guess who?).
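For readers who want a feel for the principle rather than the product: here is a minimal, purely illustrative Python sketch of the MPP idea described above. Partition the data across workers, run the same operation on every partition in parallel, then combine the partial results. The worker pool stands in for the virtualized nodes; the function names are mine, and none of this is any vendor’s actual implementation.

```python
from multiprocessing import Pool

def partial_sum(partition):
    # Each "node" scans only its own slice of the data.
    return sum(partition)

def parallel_total(data, nodes=4):
    # Partition the dataset across the virtual nodes (round-robin here).
    partitions = [data[i::nodes] for i in range(nodes)]
    # Run the same scan on every partition in parallel.
    with Pool(nodes) as pool:
        partials = pool.map(partial_sum, partitions)
    # A final aggregation step combines the per-node partial results.
    return sum(partials)

if __name__ == "__main__":
    # Gives the same answer as a serial sum, but the scan work is spread
    # across the pool of workers.
    print(parallel_total(list(range(1_000_000))))
```

The point of the sketch is the shape of the work, not the speed: a single logical query, many physical scanners, one combine step at the end.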

The statistics were the part of the Information Week report that interested me. Apparently, 37% have already adopted a virtualized database environment, and 23% were going to within the 24 months from last September or thereabouts. Great, I thought: a ready-made market. But alas, it appears that only 7% are willing to put this architecture in the cloud. That will be the security issues, I imagine. Solvable, easily solvable, but we understand the caution. It’s a very emotive issue.

So where do we go from here? The desire is there, the platforms are there, the problem is here. Where’s the catalyst?

It’s a bit like watching those penguins coming ashore in Australia. You know the ones, where two or three penguins rush up the beach to test the safety and then rush back into the sea. Only when this ritual has been performed a few times does a brave penguin stay and the rest rush up the beach together.

In the technology world, the equivalent of rushing up the beach is trying the demos and proof of concept facilities offered by Kognitio, for example, before settling down to lay your eggs…sorry, got carried away with my analogy there.

So come on, which of you is going to be the first penguin?

Nigel Sanctuary
VP Cloud Propositions, Kognitio

Big Data Analytics can be so confusing. This is not an infant industry, for heaven’s sake. 

Let me give you a view of my working week.

Monday: went to an online betting organization. Death would occur in the business if they could not identify the source and value of new customers.  Death, you understand. Not mere discomfort or searing pain, but death! The only antidote, apparently, is the production of visual representations of their customer flows, with leakage points and value drivers popping out of the eye candy to alert the business where to change its marketing or operational behavior.

Tuesday: Travelled far across ‘a great and impassable desert’ to an online shopping company who want to know how many customers are spending and on what, in a series of weekly reports. But, to delve deeper, they now need a modelling engine to predict and intercept activity on the website.

Wednesday: Jousting with two media buying organizations to create online trigger platforms for their seriously ‘whizzy’ model builders and bid management engines. They were challenging our ability to calculate predictions and execute them according to the latest onrush of vast and unstructured log files and tags. I showed ‘em.

Thursday: Conference calls with US Medical organizations: MAKE MY REPORTS GO FASTER!!! I WANT THEM NOW!!! NOW, YOU HEAR…..Oh……thank you. (Smug smile.)

Friday: A WebEx with an analytical services company creating alchemy in the digital space with sloppy truckloads of unrelated data and turning it into real-time decision gold.

My point? It’s so varied, out there. All of it called ‘analytics’, with the data ever growing and changing, and the degree of ‘analytics’ determining the price the customer pays for the analytical platform (and the services, come to that). That bit, the price, seems harder for them to grasp. How hard can it be to report data? So, our job in the industry is to simplify this for them and make the cost acceptable. We have to make it easy to collect data, make it easier to dump it into a single place, and make it straightforward to extract the bits you want for ‘analytics’. This is what I deal with day in and day out here at Kognitio: fitting diverse requirements to diverse budgets on a single platform. Luckily for me, help is at hand.

Let’s take an analogy. Plastic building blocks from Denmark (I’m scared of brand names). You have a big bucket of blocks. They are in no order, you scoop a handful out and select the pieces you want and throw the rest back. Or, you peer inside hunting for the pieces you want. You build your ‘model’ (I love this analogy) and you still have a jumble of data, sorry, blocks to do more. Wouldn’t it be great though if you had something that just scooped out the right bits, really, really quickly and placed them neatly in front of you?  Well, guess what?….It exists! For data, I mean (nearly got carried away there).

End of analogy, return to the real world of Hadoop and In-Memory, Massively Parallel Processing. Big data (blocks) in a big bucket, largely unstructured, is what these customers are faced with. Rapidly expanding disparate data with as many analytical questions as there are people to think of them. The clever part is emerging in the tools and services to deal with this. Tools, such as those provided by MetaScale, will permit you to throw your data into the big bucket and haul out the bits you want, when you need them. Conveniently they hand them to an analytical platform with in-memory, massively parallel processing capability to make the ‘analytics’ fly (Kognitio). The data flies in, it zooms out and the analysts dart and dive through the data, free of traditional database bottlenecks.
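If it helps to make the workflow concrete, here is a tiny, hypothetical Python sketch of the pattern just described: scoop only the “blocks” you need out of a large, loosely structured bucket, then run the analytics on the small in-memory slice. The record fields and function names are illustrative, not any real system’s schema.

```python
# Hypothetical records in the "big bucket": loosely structured dicts.
def scoop(bucket, predicate):
    # Step 1: haul out just the blocks you want from the bucket.
    return [record for record in bucket if predicate(record)]

def average_spend(records):
    # Step 2: fast in-memory analytics on the extracted slice.
    spends = [r["spend"] for r in records]
    return sum(spends) / len(spends) if spends else 0.0

bucket = [
    {"customer": "a", "channel": "web",   "spend": 20.0},
    {"customer": "b", "channel": "store", "spend": 35.0},
    {"customer": "c", "channel": "web",   "spend": 10.0},
]

web_records = scoop(bucket, lambda r: r["channel"] == "web")
print(average_spend(web_records))  # → 15.0
```

The division of labour is the point: the store holds everything cheaply, and only the slice that matters for today’s question ever reaches the analytical engine.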

So what does that mean for our customers and their analysts? Well, to begin with it means I can talk to them about something interesting, so they don’t get bored when I go and see them. But most of all they can control the budget (services like this are cloud-based), expand their data use without impacting current systems, and do ‘analytics’ (whatever that means to them) from weekly reporting to Vulcan thought programming without resort to a confusing array of systems.

Lucky them!

Nigel Sanctuary
VP Cloud Propositions, Kognitio

Day 120: What’s the collective noun for a group of Data Scientists?

120 days into my tenure leading the Greenplum business in the UK and Ireland, and we’ve just completed our first sponsorship of a global “Hackathon”. What a great experience. Working closely with the Data Science London community, as part of their Big Data Week, together we organized 200-plus data scientists in 10 cities to compete, over a 24-hour period, to predict the air quality in Cook County in the US. Seeing all the data scientists at Big Data Week got me thinking: what is the collective noun for a group of data scientists? Now, there is a Unix of Techies, an Accrual of Accountants, a Rash of Dermatologists and a Gazump of Estate Agents. But what about a group of Data Scientists?

Stand Up Comedians gather in a Heckle, you’ll find an Addition of Mathematicians in the university common room, and a Shortness of Jockeys in the weighing room. Dressage riders congregate in a Collection and Show Jumpers form a Crop. But what is the collective noun for a group of Data Scientists at a Hackathon?

Is it a Regression? A Correlation? A Variance? A Coefficient? Or does a Probability fit better? Maybe a Cohort of Data Scientists focuses on the data aspect as opposed to the statistical element. Odds, Risk, Confidence or a Chi would all be candidates.

But as I watch all these individuals, who have come together as a community for this week, working in unison, it is the community aspect that comes through. It is clear as the groups work together that Data Science is a team sport. But more than that, working with data is both science and art. Visualisation of data has reached such heights that maybe we should call them Data Artists.

So what word would add to the science and capture the art and the community that this profession deserves? Why not look to the world of music, to a group of people who speak or sing in unison to give life to a composition in drama or poetry recitation. And then it becomes clear…

A Chorus of Data Scientists

What started as an experiment for a handful of visionaries has suddenly become a force that has fundamentally changed our future. Data science is not only a competitive necessity, but a defining force shaping the evolution of every industry and every sector.

So as we plan our next Hackathon in June for the Chorus of Data Scientists in the London Data Science Community, I look forward to, and invite you to, the global gathering of Data Scientists in Las Vegas on May 22nd and 23rd at the World’s 2nd Data Science Summit.

See firsthand how organizations are using data science to shift the balance of power in their markets. Stay on top of the most important trends emerging from academia. Explore how to optimize your teams and strategies to realize the full potential of data science. And engage with your peers to generate your next great idea.