Our cover story this month investigates how Fleur Twohig, Executive Vice President, leading Personalisation & Experimentation across Consumer Data & Engagement Platforms, and her team are executing Wells Fargo’s strategy to promote personalised customer engagement across all consumer banking channels


This month’s cover story follows Wells Fargo’s journey to deliver personalised customer engagement across all its consumer banking channels.

Welcome to the latest issue of Interface magazine!

Partnerships of all kinds are a key ingredient for organisations intent on achieving their goals, whether that's with customers, internal stakeholders or strategic allies across a crowded marketplace. Interface explores the route to success these relationships can help navigate.

Read the latest issue here!

Wells Fargo: customer-centric banking

Fleur Twohig, Wells Fargo

Our cover story this month investigates the strategy behind Wells Fargo’s ongoing drive to promote personalised customer engagement across all consumer banking channels.

Fleur Twohig, Executive Vice President, leading Personalisation & Experimentation across the bank’s Consumer Data & Engagement Platforms, explains her commitment to creating a holistic approach to engaging customers in personalised one-to-one conversations that support them on their financial journeys.

“We need to be there for everyone across the spectrum – for both the good and the challenging times. Reaching that goal is a key opportunity for Wells Fargo and I have the pleasure of partnering with our cross-functional teams to help determine the strategic path forward…”

IBM: consolidating growth to drive value

We hear from Kate Woolley, General Manager of IBM Ecosystem, who reveals how the tech leader is making it easier for partners and clients to do business with IBM and succeed. “Honing our corporate strategy around open hybrid cloud and artificial intelligence (AI) and connecting partners to the technical training resources they need to co-create and drive more wins, we are transforming the IBM Ecosystem to be a growth engine for the company and its partners.”

Kate Woolley, IBM

America Televisión: bringing audiences together across platforms

Jose Hernandez, Chief Digital Officer at America Televisión, explains how Peru’s leading TV network is aggregating services to bring audiences together for omni-channel opportunities across its platforms. “Time is the currency with which our audiences pay us, so we need to be constantly improving our offering both through content and user experiences.”

Portland Public Schools: levelling the playing field through technology

Derrick Brown and Don Wolf, tech leaders at Portland Public Schools, talk about modernising the classroom, dismantling systemic racism and the power of teamwork.

Also in this issue, we hear from Lenovo on how high-performance computing (HPC) is driving AI research, and report again from London Tech Week, where an expert panel examined how tech, fuelled by data, is playing a critical role in solving some of the world's hardest-hitting issues, ranging from supply chain disruptions to cybersecurity fears.

Enjoy the issue!

Dan Brightmore, Editor

Our cover story investigates how the latest cybersecurity technologies ensure the Commonwealth Bank and its customers are protected from cybercrime


Our cover story this month charts how the Commonwealth Bank is strengthening its cybersecurity posture to protect 16 million customers

Welcome to the latest issue of Interface magazine!

Cybersecurity, and the need to share data safely and securely, goes beyond the day-to-day requirements of any one organisation; it's about enterprises at all levels collaborating to develop an ecosystem for the greater global good.

Read the latest issue here!

CommBank

Our cover star Memo Hayek, General Manager Group Cyber Transformation & Delivery at CommBank, is leading a team on such a journey while executing the technology transformation required to fortify cybersecurity for CommBank. Leveraging the latest cutting-edge technologies from partners including AWS and Palo Alto Networks – in demand as the global attack surface grows – Hayek is flying the flag for women in STEM careers and delivering the strategies to ensure the bank, its Australian community and the wider global economy are protected from cybercrime.

Philip Morris International

Also in this issue, we learn how Philip Morris International (PMI) is instigating a digital revolution in the travel retail sector, merging the physical and online worlds by implementing a number of CX-driven initiatives framed around PMI's IQOS brand, which is helping smokers switch to smoke-free products.

Valtech

We hear again from global business transformation agency Valtech on its efforts to embrace diversity across the length and breadth of its organisation, making it better able to provide solutions that touch all of society. Una Verhoeven, VP Global Technology, gives her perspective on the diversity debate and on how it is further supported by technological evolution, including the rise of composable architecture.

Digital Transformation

Elsewhere, we discover how biotech firm Debiopharm’s digital transformation journey is ushering in a new era for drug development and clinical trials. We also reveal the innovative global IT transformation plans of market-leading tile manufacturer Terreal.

Enjoy the issue!

Dan Brightmore, Editor

Peter Ruffley, Chairman at Zizo, discusses the promise of AI and what it will take for data centres to deliver on it.


The promise of AI

At present, the IT industry is doing itself no favours by promising the earth with emerging technologies without the ability to fully deliver on them. Hadoop's story with big data is a case in point – look where that is now.

There is also a growing need to dispel some of the myths surrounding the capabilities of AI and data-led applications – myths that often sit within the C-suite – such as the belief that investment will deliver the equivalent of the ship's computer from Star Trek, or the answer to the question 'how can I grow the business?' As part of any AI strategy, it's imperative that businesses, from the board down, have a true understanding of the use cases of AI and where the value lies.

If there is a clear business need and an outcome in mind, then AI can be the right tool. But it won't do everything for you – the bulk of the work still has to be done somewhere, whether in the machine learning phase or in data preparation.

AI ready vs. AI reality

With IoT, many organisations are chasing the mythical concept of 'let's have every device under management'. But why? What's the real benefit? All they are doing is creating an overwhelming amount of low-value data and expecting data warehouses to store it all. If a business keeps data from a device showing that it pinged every 30 seconds rather than every minute, that's just keeping data for the sake of it. There's no strategy there. The 'store everything' mentality needs to change.
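To make that concrete, here is a minimal Python sketch of the kind of downsampling that avoids storing every ping; the tuple layout, device name and one-minute window are illustrative assumptions rather than anything prescribed by Zizo.

```python
from datetime import datetime, timedelta

def downsample(readings, min_interval=timedelta(minutes=1)):
    """Keep at most one reading per device per interval.

    `readings` is an iterable of (device_id, timestamp, value) tuples ordered
    by timestamp; the field layout and one-minute window are assumptions.
    """
    last_kept = {}
    for device_id, ts, value in readings:
        previous = last_kept.get(device_id)
        if previous is None or ts - previous >= min_interval:
            last_kept[device_id] = ts
            yield device_id, ts, value

# A 30-second heartbeat stream is halved before it ever reaches the warehouse.
pings = [("pump-1", datetime(2019, 1, 1) + timedelta(seconds=30 * i), "ok")
         for i in range(10)]
print(len(list(downsample(pings))))  # 5 readings kept instead of 10
```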

One of the main barriers to implementing AI is the challenge of making data available and preparing it. A business cannot become data-driven if it doesn't understand the information it holds, and the concept of 'garbage in, garbage out' is especially true of the data used for AI.

With many organisations still on the starting blocks, or not yet having finished their journey to become data-driven, there appears to be a misplaced assumption that they can quickly and easily leap from preparing their data to implementing AI and ML – which, realistically, won't work. To step successfully into the world of AI, businesses first need to ensure the data they are using is good enough.

AI in the data centre

Over the coming years, we are going to see tremendous investment in large-scale and high-performance computing (HPC) installed within organisations to support data analytics and AI. At the same time, the onus will be on data centre providers to supply these systems without necessarily understanding the infrastructure required to deliver them, or the software and business outputs needed to get value from them.

We saw this in the realm of big data, when everyone tried to throw together some kind of big data solution and it was very easy to say 'we'll use Hadoop to build this giant system'. If we're not careful, the same could happen with AI. There have been plenty of conversations about the fact that, if we peeled back the layers of many AI solutions, we would find a lot of people still investing a lot of hard work in them – so when it comes to automating processes, we aren't quite in that space yet. AI solutions are currently very resource-heavy.

There's no denying that the majority of data centres are now being asked how they provide AI solutions and how they can assist organisations on their AI journey. Organisations might assume that data centres have everything to do with AI tied up, but is that really the case? Yes, there is a realisation of the benefits of AI, but how it is best implemented, and by whom, to get the right results has not been fully worked out.

Solutions for improving the performance of large-scale application systems are being created, whether through better processes, better hardware, or reducing running costs via improved cooling and heat-exchange systems. But data centre providers have to combine these infrastructure elements with a deeper understanding of business processes – something very few providers, Managed Service Providers (MSPs) or Cloud Service Providers (CSPs) are currently doing. It's great to have the kit, the submerged cooling systems and the advanced power mechanisms, but what does that give the customer? How can providers help customers understand what more can be done with their data systems?

How do providers differentiate themselves, and how can they show they are harnessing these new technologies to do something different? It's easy to go down the route of promoting 'we can save you X, Y, Z', but it means more to be able to say 'what we can achieve with AI is X, Y, Z'. Data centre providers need to move away from trying to win customers over on monetary terms alone.

Education and collaboration

When it comes to AI, there has to be an understanding of the whole strategic vision, of where value can be delivered and of how a return on investment (ROI) is achieved. Data centre providers need to work towards educating customers on what can be done to secure quick wins.

Additionally, sustainability is riding high on the business agenda, and this is something providers need to take into consideration. How can the infrastructure needed for emerging technologies work better? Perhaps it's by sharing data across the industry and working together to analyse it – in these cases, the whole may be greater than the sum of its parts. The hard part will be convincing people to relinquish control of their data. Can the industry move the conversation on from the purely technical – how much power, how many kilowatts – to how this helps corporate social responsibility and green credentials?

There are some fascinating innovations already happening from which lessons can be learnt. In Scandinavia, for example, carbon-neutral data centres are being built that are entirely air-cooled and sustainably powered by solar, with cooling achieved by, in essence, opening the building's windows. There are also water-cooled data centres operating under the ocean.

Conclusion

We saw a lot of organisations and data centres jump in head first with the explosion of big data and not come out with any tangible results – we could be on the road to seeing history repeat itself. If we’re not careful, AI could just become another IT bubble.

There is still time to turn things around. As we move into a world of ever-increasing data volumes, we are constantly searching for the value hidden within the low-value data produced by IoT devices, smartphone apps and the edge. As global energy costs rise, and the number of HPC clusters powering AI to drive our next-generation technologies increases, new technologies have to be found that lower the cost of running the data centre beyond standard air cooling.

It's great to see people thinking outside the box on this, with submerged HPC systems and fully, naturally aerated data centres, but more will have to be done (and fast) to keep pace with global data growth. The appetite for AI is undoubtedly there, but for it to be deployed at scale, and for enterprises to see real value, ROI and new business opportunities from it, data centres need to move the conversation on, work together and individually utilise AI in the best way possible – or risk losing out to the competition.



As UK businesses look towards the cloud to enable digital innovation, more than half (58%) say the move has been more costly than envisaged, according to new research from Capita’s Technology Solutions division.

However, the research reveals that cloud migration (72%) remains the top transformational priority for most organisations, ahead of process automation (45%), big data analytics (40%), and artificial intelligence/machine learning (31%). This is a further indication that organisations see cloud as a core component in effectively enabling these next-generation technologies.

The ‘From Cloud Migration to Digital Innovation’ report, which surveyed 200 UK IT decision makers, cites reduced cost (61%), improved speed of delivery (57%), and increased IT security (52%) as the main reasons for organisations to move to the cloud. However, 90% of respondents admitted that cloud migration had been delayed in their organisation due to one or more unforeseen factors. Issues such as cost (39%), workload and application re-architecting (38%), security concerns (37%), and skills shortages (35%) all point to a process that is more complicated than expected.

“Cloud adoption is a critical foundational step towards opening up real transformative opportunities offered by cloud-native technologies and emerging digital platforms and services. While some forward-thinking organisations are able to keep their eye on the goal, the complexity of the migration and application modernisation process tends to introduce delays and cost-implications that slow down progress,” said Wasif Afghan, head of Cloud and Platform at Capita’s Technology Solutions division.

A more complex and costly migration than expected

On average, the businesses surveyed had migrated 45% of their workloads and applications to the cloud. However, this correlated with organisation size: organisations with more than 5,000 employees have further to go, with less than a third (31%) of workloads and applications migrated. This could be the result of having larger, more complicated systems.

Nearly half (43%) of respondents found security to be one of the greatest challenges they had faced during their migration. A lack of internal skills (34%), gaining budget approval (32%), and progressing legacy migration solutions (32%) were other significant challenges organisations had faced.

In fact, half of respondents found their organisation had to ‘rearchitect’ more workloads and optimise them for the cloud than they had expected. Further, only just over a quarter (27%) found that labour/logistical costs have decreased – a key driver for moving to the cloud in the first place.

“Every migration journey is unique in both its destination and starting point. While some organisations are either ‘born in the cloud’ or can gather the resources to transform in a relatively short space of time, the majority will have a much slower, more complex path. Many larger organisations that have been established for a long time will have heritage IT systems and traditional processes that can’t simply be lifted and shifted to the cloud straight away due to commercial or technical reasons, meaning a hybrid IT approach is often required. Many organisations haven’t yet fully explored how they can make hybrid work for them, combining the benefits of newer cloud services whilst operating and optimising their heritage IT estate,” said Afghan.

A platform for innovation

Despite some of the challenges outlined in the report, the majority (86%) of respondents agree that the benefits of cloud are compelling enough to outweigh its downsides. For more than three-quarters (76%) of organisations, moving to the cloud has driven an improvement in IT service levels, while two-thirds (67%) report that cloud has proven more secure than on-premise.

Overall, three-quarters of organisations claimed to be satisfied with their cloud migrations. However, only 16% were ‘extremely satisfied’ – indicating that most organisations have not yet seen the full benefits or transformative potential of their cloud investments. In addition, 42% of respondents believe that cloud has ‘overpromised and underdelivered’.

“It’s no longer enough to think of cloud as simply a way to benefit from initial cost savings or just another place to store applications and data. Today, the move to cloud is driving a spirit of innovation right across the enterprise, paving the way for advanced digital services to be rolled out in a highly accessible, faster and more cost-effective way – whether that’s AI, RPA, complex data analytics or machine learning. Only through the alignment of IT and lines of business leadership – in terms of goals, vision, direction and mindset – can organisations fully unleash the potential of cloud to address their key business objectives, whether that is improving business agility, delivering an enhanced customer experience or enhancing business efficiencies.” said Afghan.

The ‘From Cloud Migration to Digital Innovation’ report can be downloaded at https://go.capita-it.com/cloud-research-report.



Experts have been predicting for some time that the automation technologies applied in factories worldwide would one day be applied to datacentres – not only to improve their efficiency, but to help gather business insights from ever-increasing pools of data. The truth is that we're rapidly advancing this possibility with the application of Robotic Process Automation (RPA) and machine learning in the datacentre environment. But why is this so important?

At the centre of digital transformation is data and, thus, the datacentre. As we enter this new revolution in how businesses operate, it's essential that every piece of data is handled and used appropriately to optimise its value. This is where datacentres become crucial as the central repository for data. Not only are they required to manage increasing amounts of data and ever more complex machines and infrastructure, we also want them to generate better information about our data, more quickly.

In this article, Matthew Beale, Modern Datacentre Architect at automation and infrastructure service provider Ultima, explains how RPA and machine learning are today paving the way for the autonomous datacentre.

The legacy datacentre

Currently, businesses spend too much time and energy dealing with upgrades, patches, fixes and monitoring of their datacentres. While some may run adequately, most suffer from three critical issues:

• Lack of consistent support – for example, humans make errors when updating patches or maintaining networks, leading to compliance issues.

• Lack of visibility for the business – for example, multiple IT staff look after multiple apps or different parts of the network with little coordination around what the business needs.

• Lack of speed when it comes to increasing capacity, migrating data or updating apps.

Human error is by far the most significant cause of network downtime, followed by hardware failures and breakdowns. With little to no oversight of how equipment is working, action can only be taken once the downtime has already occurred. The cost impact is then much higher, as focus is pulled away from other work to manage the cause of the issue, on top of the impact of the downtime itself. Stability, cost and time management must be tightened to provide a more efficient datacentre. Automation can help achieve this.

‘Cobots’ make humans six times more productive

Automation provides ‘cobots’ to work alongside humans, with wide-ranging benefits. The precisely structured environment of the datacentre is the perfect setting in which to deploy these software robots. There are many menial, repetitive and time-intensive tasks that can be taken away from users and given to a software robot, boosting both consistency and speed.

Ultima calculates that the productivity ratio of ‘cobot’ to human is 6:1. Once processes worth automating have been identified, software robots can be programmed and, once verified, will repeat them every time. Whatever the process, robotics ensures it is carried out consistently and accurately, making every task much more efficient. This empowers teams to intervene only to make decisions in exceptional circumstances.

The self-healing datacentre

Automation minimises the amount of human maintenance the datacentre requires. Robotics and machine learning restructure and optimise traditional processes, meaning humans are no longer needed to patch servers at 3am. Issues can be identified and flagged by machines before they occur, eliminating downtime.

Re-distribution of resources and capacity management

As the lifecycle of an app across the business changes, resources need to be redeployed accordingly. With limited visibility, it’s extremely difficult, if not impossible, for humans to distribute resources effectively without the use of machines and robotics. For example, automation can increase or decrease resources accordingly towards the end of an app’s life to maximise resources elsewhere. Ongoing capacity management also evaluates resources across multiple cloud platforms for optimised utilisation. When the workload is effectively balanced, not only does this offer productivity cost savings, it also allows for predictive analytics.

The art of automation

These new, consumable automation functions are the result of work Ultima has been doing over the last year, when it found itself solving similar problems for three of its customers. It was moving them off the end-of-life 5.5 version of VMware and recognised that it would be helpful to migrate them to the updated version automatically, so it developed a solution to do just that. Where once it would have taken 40 days to migrate workloads, the business cut that in half, resulting in a 33 per cent cost saving for those companies. It then moved on to other processes to automate, with the ambition of taking its customers on a journey to full datacentre automation.

Using discovery tools and automated scripts to capture all the data required to design and migrate infrastructure to the automated datacentre, Ultima treats infrastructure as code to create repeatable deployments customised for each customer's environment. These datacentre deployments can then scale where needed without manual intervention.

The journey to a fully automated datacentre

The first level of automation presents information to administrators in a user-friendly, consumable way so they can take action. The next level provides recommendations, based on usage trends, for administrators to accept. From there, automation leads to a system that automatically takes remediation actions and raises tickets based on smart alerts. Finally, you move to a fully autonomous datacentre utilising AI and ML, which determines the appropriate steps and can self-learn and adjust its own thresholds.
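As a rough illustration of that progression, the short Python sketch below walks a single alert through those stages; the metric name, threshold and actions are invented for the example and do not represent Ultima's implementation.

```python
# Stages: 1 = inform, 2 = recommend, 3 = auto-remediate and raise a ticket,
# 4 = self-learning (reduced here to adjusting its own threshold).
AUTOMATION_LEVEL = 3

threshold = 85  # illustrative utilisation threshold (%)

def add_capacity():
    print("provisioning additional capacity...")  # stand-in for a real provisioning call

def handle_metric(name, value):
    global threshold
    if value < threshold:
        return
    if AUTOMATION_LEVEL == 1:
        print(f"INFO: {name} at {value}%, above {threshold}%")
    elif AUTOMATION_LEVEL == 2:
        print(f"RECOMMEND: {name} high - suggested action: add_capacity")
    else:
        print(f"REMEDIATE: {name} high - acting and raising a ticket")
        add_capacity()
        if AUTOMATION_LEVEL >= 4:
            threshold = min(95, threshold + 2)  # crude self-adjustment of the threshold

handle_metric("datastore_utilisation", 92)
```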

AI-driven operations start with automation

Businesses are adopting modern ways of consuming applications as well as modern ways of working. Over 80 per cent of organisations are either using or adopting DevOps methodologies, and it is critical to the success of these initiatives that the platforms in place can support these ways of working while still keeping efficiency and utilisation high.

In the not-too-distant future there will be a central platform supporting traditional and next-generation workloads, automated in a self-healing, optimal way at all times. This means that when it comes to migration, maintenance, upgrades, capacity changes, auditing, back-up and monitoring, the datacentre takes the majority of actions itself, with little or no human intervention required. As with autonomous vehicles, the possibilities for automation are never-ending; it's always possible to keep improving the way work is carried out.

Matthew Beale is Modern Datacentre Architect, Ultima, an automation and transformation partner. You can contact him at matthew.beale@ultima.com and visit Ultima at www.ultima.com



By Lee Metters, Group Business Development Director, Domino,

“Get closer than ever to your customers. So close, in fact, that you tell them what they need well before they realise it themselves.” Steve Jobs

Every brand aspires to get close to its customers to understand what makes them tick. Those that succeed invariably deliver better experiences that inspire long-term loyalty. Today, the world’s biggest brands know us so well they’re able to personalise their marketing to match our individual tastes and behaviours. When Netflix recommends you try Better Call Saul, it’s because it knows you binge-watched Breaking Bad. The personal approach works; whether it’s a Netflix notification or a ‘programmatic playlist’ from Spotify, targeted recommendations – informed by deep learning and vast data – hugely influence the content we stream. Steve Jobs was right: successful brands get so close to their customers, they can tell them what they need long before they know they need it. And we all keep coming back.

However, not all brands are as fortunate as the digital disruptors. How do you get close to your customer when your brand isn’t an online service that’s routinely capturing user data? If you’re marketing a physical entity – a food, a toy, a designer handbag or a male grooming kit – how do you even know who your customers are (let alone what they need) when complex supply chains inevitably separate you from your end-user? How can you add brand value when you can’t build a direct relationship with your customer or lay the foundation for long-term engagement? The answer is: you can. In fact, as Lee Metters, Group Business Development Director, Domino, examines, with the advent of simple, affordable technology, you can do it quickly, easily, and cost-effectively. 

New opportunities

A convergence of factors is creating new opportunities for marketers to transform the way they manage their brands through the consumer lifecycle. The availability of personalised barcodes, combined with the ability of smartphones to read them, has reinvented consumer behaviour, with shoppers increasingly scanning product barcodes to discover more about the brands they buy. However, until recently, the absence of standardised coding meant that brands needed to create proprietary apps to deliver their value-added features, relying on customers’ willingness to download ‘yet another app’ in a world of app fatigue. The introduction of GS1 Digital Link barcodes, which provide a standards-based structure for barcoding data, has removed this need for product-specific apps. It has opened up the potential for marketing innovation – such as digitally activated campaigns that can transform a product into an owned media channel – enhancing the brand experience and building stronger connections with customers. This key development has been assisted by the emergence of advanced coding and marking systems that are helping brands include more information on every product, allowing them to personalise customer experiences at speed and scale.
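To give a flavour of what that standards-based structure looks like, here is a minimal Python sketch that composes a GS1 Digital Link style URI from GS1 application identifiers (01 for GTIN, 10 for batch/lot, 21 for serial number); the domain and product values are invented for illustration.

```python
from urllib.parse import quote

def gs1_digital_link(domain, gtin, lot=None, serial=None):
    """Compose a GS1 Digital Link style URI.

    Application identifiers: 01 = GTIN, 10 = batch/lot, 21 = serial number.
    The domain and values used below are illustrative, not a real product.
    """
    path = f"/01/{gtin}"
    if lot:
        path += f"/10/{quote(lot, safe='')}"
    if serial:
        path += f"/21/{quote(serial, safe='')}"
    return f"https://{domain}{path}"

# One QR code encoding this URI can resolve to brand content for this exact item.
print(gs1_digital_link("id.example-brand.com", "09506000134352", lot="AB12", serial="4001"))
# https://id.example-brand.com/01/09506000134352/10/AB12/21/4001
```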

With customer intimacy considered a key driver of commercial success, personalised coding and marking can help brands achieve the Holy Grail of getting closer to their customers. What’s more, it provides a platform for value-added innovation that builds engagement, trust, and long-term brand loyalty. The potential applications are exciting and wide-ranging. 

Internet of Products

Digital innovation is not limited to online brands – practically every product can form part of a connected and accessible online ecosystem. An internet of products. In its simplest form, personalised barcoding can provide a gateway to online content – user manuals, product details, blogs, communities, and customer support – that enhances the brand experience. However, beyond the basics, the opportunities for compelling customer engagement go much further. Leading brands are using QR codes to trigger anything from loyalty schemes and competitions to gamification and immersive brand experiences. Progressive brands are using barcodes to create innovative gifting solutions – allowing customers to record personal video messages to accompany their presents, giving their loved ones a more memorable experience.

The potential for innovation is significant – and the rewards are too. For example, in Germany, Coca-Cola used barcoding on cans and bottles to engage directly with consumers, with a simple scan connecting customers with ‘in the moment’ mobile experiences. The digitally activated campaign allowed Coca-Cola to transform its products into an owned media channel, captivating customers with personalised content, incentives, and competitions that generated unprecedented brand engagement. The campaign has subsequently been rolled out across 28 markets in Europe and North America.

Provenance and authenticity

Serialisation, first introduced to safeguard the medicines supply chain against the plague of counterfeit drugs, is now being widely applied across many industries – allowing brand owners and customers to track and trace products and determine their authenticity. This is a significant value-add in sectors like food, where discerning consumers are increasingly interested in the provenance of produce, and the journey foods make from farm to fork. With carbon footprint and other environmental issues now a key influence on consumer purchases, traceability is a major value-add across most commercial industries. 

The value of data

Barcode innovation undoubtedly provides considerable value for consumers. With research showing that customer experience is the most competitive battleground in consumer markets, qualities such as transparency, social responsibility and open engagement are all crucial ingredients in a trusted brand experience – and personalised barcoding can help deliver them. But the value exchange isn't all one way: marketers benefit too.

GS1 Digital Link barcodes provide a mechanism to capture a rich seam of real-time data that can help brands understand – and respond to – customers’ needs. Simple information such as user profiles, geo-location, purchase history, dates, and times can be leveraged to build a dynamic picture of individual customers, helping to inform a wide range of services and communications. This data can provide a powerful marketing platform – an organic and automated CRM – to target customers and personalise communications based on identifiable preferences and behaviours. Marketers can understand customers’ buying cycles to trigger timely and relevant alerts. They can upsell products and accessories, and nudge customers when warranties expire or past purchases are getting old and tired. And just like Netflix, they can recommend new products that customers will love – long before they know they need them.
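As a hedged sketch of what that 'organic CRM' could look like on the data side, the example below treats each scan or registration as a small record and applies one simple rule to generate a nudge; the field names, values and 300-day threshold are assumptions made purely for illustration.

```python
from datetime import date

# Each barcode scan or product registration becomes a small event record.
scan_events = [
    {"customer": "c-101", "product": "grooming-kit", "purchased": date(2018, 9, 1)},
    {"customer": "c-102", "product": "designer-bag", "purchased": date(2019, 5, 20)},
]

def nudge_candidates(events, today, max_age_days=300):
    """Return events whose purchase is old enough to warrant a replacement nudge."""
    return [e for e in events if (today - e["purchased"]).days > max_age_days]

for event in nudge_candidates(scan_events, today=date(2019, 7, 1)):
    print(f"Nudge {event['customer']}: time to refresh their {event['product']}")
```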

Cracking the code

The emergence of GS1 Digital Link barcodes – and the smart technologies that support them – is transforming the retail experience, helping consumers find out more about the products they buy and bringing brands much closer to customers. As the High Street battles tough economic conditions and the rise of digital disruptors, the successful brands of tomorrow will be those that exploit the creative opportunity of personalised barcoding and deploy the advanced coding and marking systems that make the magic happen.

It’s time to crack the code.



As location data continues to dictate customer interactions, Tableau Software is redefining the data-driven conversation with the unveiling of its latest next-generation mapping capabilities, which will enhance how people analyse location data.

With the general availability of Tableau 2019.2 now live, the company's product offering allows for greater understanding of location data through mapping technology. The latest release utilises Mapbox mapping technology to implement vector maps that let people see more detailed location data and perform deeper analysis. The newest version also includes parameter actions for more visual interactivity.

This comes at a key time in the location data conversation, with recent reports indicating that by 2022, 30% of customer interactions will be influenced by real-time location analysis. Tableau can now provide a smoother, more efficient experience, as well as far richer background mapping layers for geospatial data, including train stations, building footprints and terrain information.

PATH, a global health organisation that uses Tableau and Mapbox to monitor reported cases of disease and keep tabs on communicable diseases in hot spots, will see key benefits from these new geospatial capabilities.

“Monitoring the reported cases of diseases like malaria will be enhanced greatly by accurately placing those cases on a map. As visualisation tools, maps engender a sense of both place and scale. They also instigate exploration and discovery, so decision makers can see where diseases are emerging and make comparisons to where they have available resources such as health facilities, drugs, diagnostics or community health workers.” said Jeff Bernson, Vice President, Technology, Analytics and Market Innovation at PATH. “By adding more accurate and detailed vector mapping into our work with Tableau through initiatives like Visualize No Malaria, our country partners can more easily and precisely keep tabs on communicable diseases in hot spots, and get help to those who need it faster.”

Tableau 2019.2 follows the recent introduction of its Ask Data platform. Revealed earlier this year, Ask Data uses the power of natural language processing to enable people to ask data questions and get an immediate visual response.

“Tableau’s unparalleled community inspires and motivates our rapid pace of innovation. With every release, we are working to simplify and enhance the analytics experience so that even more people can easily ask and answer questions of their data,” said Francois Ajenstat, Chief Product Officer at Tableau. “From empowering new analytical creativity with parameter actions, to unlocking the power of spatial data through a richer, more advanced mapping experience, Tableau 2019.2 takes interactivity to the next level for our customers.”

You can find out more information on Tableau 2019.2 and a full breakdown of its features at tableau.com/new-features




By Elif Ecem Seçilmiş, an Associate at Kılınç Law & Consulting

The EU’s General Data Protection Regulation (GDPR) was created with the aim of homogenising data privacy laws across the EU. GDPR also applies to organisations outside the EU, if they monitor EU data subjects, or offer goods and services to them. The GDPR applies to personal data, which is defined as any information relating to an identifiable natural person.

In certain cases, frameworks such as the EU-US Privacy Shield have been implemented to ensure the protection of data being transferred outside the EEA. However, such frameworks have not been established in all countries outside of the EEA. In such cases, businesses need to be keenly aware of the data protection laws in each territory, in order to ensure compliance.

Businesses based within the EEA that wish to send personal data outside the EEA also need to pay particularly close attention to GDPR. GDPR restricts the transfer of any personal data to countries outside the EEA.

The European Commission has made “adequacy decisions” regarding the data protection regimes in certain territories. Territories where the data protection regime has been deemed adequate include Andorra, Argentina, Guernsey, the Isle of Man, Israel, Jersey, New Zealand, Switzerland and Uruguay. The EU Commission has also made partial findings as regards the adequacy of the regimes in the US, Japan and Canada.

If a business wishes to send data to a country that is not in the EEA, and which is not covered by an “adequacy decision”, it will need to ensure that the appropriate safeguards set out in the GDPR are implemented.

In order to facilitate data transfers within multinational corporate groups, “binding corporate rules” may be submitted to an EEA data supervisory authority for approval. If these are approved, then all members of the group must sign up to these rules and they then may transfer data outside the EEA, subject to the binding corporate rules.

Another way to make a restricted transfer outside the EEA is for both parties to enter into a data-sharing agreement, which incorporates the standard data protection clauses adopted by the European Commission.

The Commission has published four sets of such model clauses, which set out the obligations of both the data exporter and data importer. The clauses may not be amended and must appear in the agreement in full. The penalties for non-compliance with GDPR are significant: organisations can be fined up to €20 million or 4% of their annual global turnover for breaches.

Article 49 of GDPR also sets out derogations from the GDPR’s general prohibition on transferring personal data outside the EEA without adequate protection. The derogations can apply, for example, where there is an important public interest, or the data must be transferred for legal proceedings. A derogation can also apply where the data subject has been fully informed of the risks but has given their explicit consent to the transfer.

The advent of GDPR has significance for companies doing business internationally. However, such companies also need to think beyond GDPR. They may find themselves subject to the data protection regimes of third countries, even if they have no physical presence there. For example, international companies without a presence in Turkey may be subject to Turkish data protection law if their activities have an effect in Turkey.

A registration system for data processors is currently being rolled out in Turkey. Data processors based outside Turkey whose activities have an effect in Turkey may need to register by 30 September 2019.

Turkey’s 2016 Law on the Protection of Personal Data is based largely on EU data protection law. As a candidate state for EU membership, Turkey aligns much of its legal system with EU law. Many of its requirements are broadly similar to EU law. However, there are also some very important differences which companies whose businesses have an effect in Turkey should be mindful of.

Turkish data protection law allows for administrative fines of up to three per cent of a company’s net annual sales to be levied if personal data is stolen, or disclosed without consent.  Turkish data protection law applies to both sensitive and non-sensitive personal information.

Personal data may not be transferred outside Turkey without the consent of the data subject, except in strictly limited circumstances. Regulatory approval is required for such transfers where the transfer may harm Turkey or the data subject.

Unlike GDPR, however, “explicit consent” is required by Turkish Law to process both sensitive and non-sensitive data. The exceptions to this general rule include where there is a legal obligation on a data processor to process the data, and where such processing is necessary to protect the life of the subject. Further processing is not allowed without specific consent, and there is no “compatible purpose” exception in Turkish law. The definitions of consent also differ in Turkish law and under GDPR.

GDPR has caused many EEA companies to consider in detail the laws restricting the transfer of data out of the EEA. However, companies may also be subject to laws restricting the transfer of data into the EEA.

Elif Ecem Seçilmiş is an Associate at Kılınç Law & Consulting



According to an Accenture study, 79% of enterprise executives agree that companies not embracing big data will lose their competitive edge, with a further 83% affirming that they have pursued big data projects at some point to stay ahead of the curve. Considering that data creation is on track to grow 10-fold by 2025, it’s crucial for companies to be able to process it more quickly, and meaningfully.

One of the latest in the stream of buzzwords, “big data” gets thrown around in business and tech circles as if everyone truly understands it – but do they really? Big data is the label for extremely large data sets that can be analysed to surface trends and patterns and so inform better business decision-making.

That may sound simple enough, and although plenty of information is available about big data technologies, few have actually mastered the art of using big data to its full potential. In a survey undertaken by Capgemini, just 27% of executives described their big data initiatives as ‘successful’, reinforcing that while many are talking about it and have ambitions around it, many businesses still have much to learn.

Implementing effective, fast data processing helps keep your company successful, and it is only growing in importance given the diverse, and large, amounts of data that businesses produce. While this can seem daunting, it actually gives us all the ability to analyse more innovatively.

Coupled with the growing dominance and capabilities of cloud computing, now is the perfect time to take a proper look at “big data analytics”, so you too can see how the power of crunching big data is bringing competitive advantage to companies.

Big data and cloud computing – a perfect pair

Data processing engines and frameworks are key components for computing data within a data system. Although there is no hard distinction between “engines” and “frameworks,” it's useful to define the terms separately — think of an engine as the component responsible for operating on data, while a framework is typically the set of components designed to do the same.

Although systems designed to handle the data lifecycle are rather complex, they ultimately share very similar goals — to operate over data in order to broaden understanding and surface patterns while gaining insight on complex interactions.

To do all this, however, there needs to be infrastructure that supports large workloads – and this is where cloud comes in. Clouds are considered a beneficial tool by enterprises across the world because they make it possible to harness business intelligence (BI) from big data. The scalability of cloud environments also makes it much easier for big data tools and applications, like Cloudera and Hadoop, to function.

Programming frameworks available to find the right fit

Several big data tools are available, and some of these include:

Hadoop: This Java-based programming framework supports the processing and storage of extremely large data sets in a distributed computing environment. It is open source and part of the Apache project, sponsored by the Apache Software Foundation. Hadoop's supporting software packages and components can be deployed by organisations in their local data centre.

Apache Spark: Apache Spark is a fast engine for big data processing, capable of streaming and supporting SQL, graph processing, and machine learning (a minimal usage sketch follows this list). Alternatively, Apache Storm is also available as an open-source data processing system.

Cloudera Distribution: This is considered one of the latest open-source platforms available to discover, store, process, model, and serve large amounts of data. Apache Hadoop forms part of this platform.
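To give a sense of how compact a job can be with these tools, here is a minimal PySpark sketch that loads a CSV and aggregates it with Spark SQL; it assumes a local PySpark installation, and the file name and column names are placeholders.

```python
from pyspark.sql import SparkSession

# Start a local Spark session (assumes pyspark is installed).
spark = SparkSession.builder.appName("sales-summary").getOrCreate()

# Load a CSV of sales events; "sales.csv" and its columns are placeholders.
sales = spark.read.csv("sales.csv", header=True, inferSchema=True)
sales.createOrReplaceTempView("sales")

# Aggregate revenue per region using Spark SQL.
summary = spark.sql(
    "SELECT region, SUM(amount) AS total_revenue FROM sales GROUP BY region"
)
summary.show()

spark.stop()
```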

Hadoop on CloudStack to Crunch Data Effectively

Hadoop, which is modelled on Google's MapReduce and Google File System technologies, has gained widespread adoption in the industry. Like CloudStack, it is implemented in Java.
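To show the shape of the MapReduce model Hadoop builds on, here is a tiny, single-process Python imitation of its map, shuffle and reduce phases counting words; real Hadoop distributes these phases across a cluster, which this sketch deliberately does not attempt.

```python
from collections import defaultdict

def map_phase(document):
    # Emit (word, 1) pairs, as a Hadoop mapper would.
    for word in document.split():
        yield word.lower(), 1

def shuffle(pairs):
    # Group values by key, standing in for Hadoop's shuffle-and-sort step.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the counts for each word, as a reducer would.
    return {word: sum(counts) for word, counts in grouped.items()}

documents = ["big data needs big infrastructure", "data drives decisions"]
pairs = (pair for doc in documents for pair in map_phase(doc))
print(reduce_phase(shuffle(pairs)))  # {'big': 2, 'data': 2, ...}
```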

As the first ever cloud platform in the industry to join the Apache Software Foundation, CloudStack has quickly become the logical cloud choice for organisations that prefer open-source options for their cloud and big data infrastructure.

The combination of Hadoop and CloudStack is truly a brilliant match made in the clouds. With big data tools like these available to work in the cloud and deliver meaningful BI, now really is the perfect time to harness the power of big data and drive your business forward.



Lesley Holmes, Data Protection Officer at leading HR and payroll provider MHR, gives a valuable insight into the future of technology and how the axis of power may swing towards tech leaders.

A phrase I hear a lot is that ‘data is the new oil’, in reference to data as an extremely valuable commodity, which is increasing in value year by year and may well one day have a similar value to fossil fuels.

If data is the new oil, then the people controlling the data must be the new oil barons, maybe even becoming even more powerful than individual oil barons at some point in the future, as they are not tied to set geographical areas for ‘mining’ and will never run out of new data.

Oil prices in the global marketplace are controlled by a handful of people, yet the decisions they make have a huge impact on world economies, so the power of data might just create a similar group of digital oligarchs.

I feel that the ‘data-mining’ carried out by these individuals can serve several purposes:

  • For the public benefit.
  • For the benefit of a particular organisation using its own collated data.
  • For the purposes of monetisation or to influence outcomes through targeted marketing.

Public Benefit.

Most people understand that data can benefit us all in various ways – in anti-terrorism work and the detection of other crimes through the use of statistics, for example, or in using CCTV footage to log crimes.

Governments also collate data from both public and private sources to help plan public services better and prevent economic, social and environmental issues, by identifying data trends.

Data can also be used for things like medical research, or to gauge public opinion. This is often done by public bodies with the public interest at heart, so the data isn't used directly for profit; the research is done to benefit us all.

An organisation using its own collated data.

Organisations can gather their own data, in accordance with their privacy notice, which will make clear what they are doing and why (in most cases anyway!).

They use this data to improve the services they offer, work out the effectiveness of their marketing and plan their workforce; not to mention informing strategies for performance and profitability.

Data also has specific uses, like assessing actuarial risk in the insurance industry, with the aim of providing a better service based on strong data – low-risk customers get a better quote – so there are many positives to gathering data.

Aside from using data to assist customers, organisations can use data they hold on their own employees for purposes which help the business, like monitoring performance trends, absence management and workforce optimisation.

Besides the obvious benefit of using company data to build a better business, organisations over a certain size are required to produce reports for the government. An obvious recent example was the introduction of Gender Pay Gap reporting, part of a wider investigation into equal pay in the UK, which takes personal data and anonymises it for reporting purposes. There is debate over whether this data might be misused and encroach on personal freedom, but that's a discussion for another day…

For the purposes of monetisation.

In the last year a huge number of articles have been written illustrating the risks of big data when it is misused, most notably the Facebook/Cambridge Analytica data breach – but this isn't an isolated event. Just like the oil barons discussed at the beginning of this article, many other companies are extracting and refining your personal data like oil for massive profits.

Data is already taking a sinister turn.

Hidden cameras running facial-detection software are now being used to establish which adverts shoppers like best. As shoppers walk through shopping centres, the cameras gauge the reaction to each advert, changing it when the reaction is a negative expression.

While this seems like a great advance in technology, there is an issue.

These technologies use facial detection (capturing a blurry image), rather than true facial recognition, but the quality of data is sufficient to distinguish gender with 90% accuracy, age to within five years and mood range (from very happy to very unhappy) to around 80% accuracy. In many countries this happens without consent, or even customer knowledge, which is a worrying trend.

This shows the world is changing.

The recent discussion around facial recognition technologies suggests these will be exploited further to enhance the customer experience. This will come through uses such as ATM identity verification and hotel check-in processes, designed to increase customer satisfaction while reducing demand on employees.

Behind the scenes, datasets are manipulated and combined to identify trends, forecast spending patterns and support other activities that lead to profit – including the use of personal data for commercial purposes, such as drug trials by companies hoping to create expensive products from the data they gather.

Facebook, of course, allowed an app to harvest millions of data items to target content that may have created political sway, which demonstrates the power of the tech companies to influence political and social outcomes. There is much speculation about how harvested data has been used in the political environment – and who knows? We may ourselves have been influenced by such data.

For the prevention and detection of crime.

Data, personal and otherwise, has been used for years to help prevent and detect crime. The use of forensic techniques started in China in the 700s, when fingerprints started to be used, but the most significant breakthroughs came in the last century with the creation of dedicated teams to deal with this area of investigation.

Now the Chinese again lead the way with facial recognition being used to identify and capture criminals as they move around the major cities. With the largest number of CCTV cameras, China is probably embracing the technology for more than just policing.

So what are the dangers?

What’s clear is that these ‘data barons’ can use the data for good, but they will be (and perhaps already are) so powerful that anything other than the most scrupulous data usage has the potential for disastrous societal issues.

Objection to overzealous state control has resulted in everything from strongly worded literature to violent protests, but at least governments can be held accountable, and we know who’s in charge.

The clandestine nature of the internet means that some of the most powerful public figures in future will not be public at all, just pulling the strings through the data-wells they possess.

What’s clear is that we need to establish a way of controlling the use of data, or we lose control of everything else.



By Johnny Carpenter, Director of Sales EMEA, iland

If you serve on the board of a UK organisation, it’s likely that digital transformation is high on your agenda as you look strategically at futureproofing your business. A key part of that is ensuring that the IT infrastructure supporting your company is functioning robustly as a platform on which to build competitiveness, rather than a legacy anchor holding back innovation and growth. Moving to an Infrastructure-as-a-Service (IAAS) set-up is increasingly the way that companies aim to unlock potential and enable more dynamic, flexible business processes.

The benefits of IAAS are clear: It’s flexible and can easily scale as your business grows. It removes the burden of maintaining legacy systems and allows the easy deployment of new technology and, ideally, you only pay for what you use on a predictable opex basis; you won’t be paying to maintain capacity that is rarely needed. It also allows you to add on services such as analytics and disaster recovery-as-a-service and it’s the perfect environment for the big data projects requiring large workloads and integration with business intelligence tools.

All these drivers mean that boards can be under pressure to quickly sign off on cloud migration projects. However, it could be a case of more haste, less speed if boards don’t ask the right questions before they sign on the dotted line. It’s important that decision makers don’t simply view IAAS as a commodity purchase – there are a range of providers from hyperscalers to vertical sector specialists and they’re not all the same. Boards must undertake due diligence when making the IAAS decision and there are some key questions that should be asked to ensure that the project delivers both the operational and also the strategic outcomes required.

What’s the scale of our ambition and what business outcomes do we want to see?

We tend to see cloud migration projects falling into one of two camps. In the first, businesses simply want to “lift and shift” their current operations and replicate them exactly in a cloud environment. Naturally they want to see the benefits of cost and flexibility, but fundamentally they want a similar experience after the migration to what they had before. The second scenario sees companies wanting to fully overhaul their infrastructure and deliver a completely different model back to the business – more of a true digital transformation.

It’s important to know which camp you’re in and be sure that your prospective IAAS provider is aligned, because in either case, ending up with the alternative scenario will cause pain. What should be a straightforward process becomes overly complicated when the destination is not clear from the outset.   

How much support do we require at onboarding and ongoing?

Support for the initial cloud migration varies between providers from do-it-yourself to a full concierge migration service.

If you opt for a hyperscale provider, you’ll find the approach is more on the DIY side – there are a wealth of options but it’s up to you to figure out what’s best for your business and mix and match accordingly. This works if you have in-house capability or are happy to employ consultancy expertise in order to manage the move.

At the other end of the scale are providers offering an end-to-end concierge service to get you up and running with onboarding, deployment and testing. Your IT team will be expected to bring their existing skillsets, but little additional learning is required.

In both cases, you also need visibility of the ongoing costs associated with support for your cloud environment and the availability of that support.

What are our security and compliance requirements and how will they be managed in the cloud?

Managing risk is a significant board responsibility that only increases as regulations tighten. Company data is one of the most high-risk assets the business possesses and its safety in the cloud has to be beyond reproach. Prospective CSPs should be able to provide assurances of the security offered by their cloud that meet or ideally exceed the organisation’s compliance requirements.

Assurance at the start is one thing, but ongoing auditing and reporting is also critical. The GDPR, for example, requires that organisations demonstrate how they are taking steps to protect data on a continuous basis and you’ll need to work with your CSP to achieve this.

Again, offerings differ. Some providers will expect you to take responsibility yourself, bringing your own security and compliance team, software and processes with you. Others, including iland, have built a dedicated practice around compliance that is at the disposal of customers. This can be invaluable if your compliance team is small or you don’t have in-house support. Either way, it’s another important consideration when adopting IAAS.

Pricing – How flexible is flexible?

 The lure of only paying for the resources you use is a powerful motive for moving to IAAS. Whichever provider you choose, it is likely to be more cost-effective than your legacy environment, but to really reap the full economic benefits, you need to ensure that there’s a good match between cloud workloads and cloud resource utilisation.

Some providers will allow you to reserve cloud resources based on exactly the amount of GB required, with billing based on actual compute usage, while others work on a “best fit” basis, offering a range of predetermined instance sizes. There is a risk here of paying for resources you don't use, so it's important to check that your requirements are close to the instance size selected. You also need to ensure that you understand the billing system and have visibility over any additional costs, such as VPNs or burstable charges, that might be incurred. You certainly don't want any nasty surprises further down the line.
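As a back-of-the-envelope illustration of that gap, the Python sketch below compares an exact reservation with a "best fit" instance; the prices, sizes and workload figure are invented, and the only point is that a best-fit instance can leave a meaningful share of reserved resource unused.

```python
workload_gb = 180                   # memory the workload actually needs (assumed)
price_per_gb_hour = 0.005           # exact-reservation price per GB-hour (assumed)
hours_per_month = 730
instance_sizes_gb = [64, 128, 256]  # a provider's predetermined sizes (assumed)

# Cost when billed for exactly what is reserved vs the smallest instance that fits.
exact_cost = workload_gb * price_per_gb_hour * hours_per_month
best_fit = min(size for size in instance_sizes_gb if size >= workload_gb)
best_fit_cost = best_fit * price_per_gb_hour * hours_per_month

print(f"Exact reservation:          {exact_cost:8.2f} per month")
print(f"Best-fit {best_fit} GB instance:   {best_fit_cost:8.2f} per month "
      f"({best_fit - workload_gb} GB paid for but unused)")
```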

Fundamentally, adopting infrastructure-as-a-service is a sound decision, but it still needs careful scrutiny to make sure the business gains the maximum benefits possible. Even though boards are under pressure to sign off deals, they should ask the right questions to make sure their investment delivers the business outcomes they’re looking for.