How the Internet of Things is affecting cybersecurity

The world of IoT devices offers consumers and businesses a wealth of benefits, but by sharing more data than ever before, are we also exposing ourselves to greater cybersecurity risks?

Background

The Internet of Things (IoT) is the name given to the increasingly large number of internet-connected devices available to both consumers and businesses. From Amazon’s popular home assistant, the Amazon Echo, to fridges, thermostats and even cars, the number of internet-connected devices continues to grow and is forecast to reach 20 billion by 2020.

The IoT has brought many benefits, with formerly everyday objects now able to become greater than the sum of their parts by connecting to surrounding objects, sharing an extensive amount of data about our lives in the process.

Traditionally we may have only thought of interconnected devices in terms of computers, and later smartphones and tablets. The world of IoT is one in which just about anything can be connected and communicate in a meaningful way by sharing data to produce usable intelligence. With the IoT, the physical world is becoming one big information system, with the goal of simplifying processes and empowering individuals and businesses.

However, the more personal information and business data that exists in the cloud to make the IoT work, the more it can be exploited through the devices we are coming to increasingly rely upon. A weak link in the chain could provide hackers with nearly limitless entry points that could lead them to valuable data.

The problem

While IoT devices are undoubtedly improving our lives and businesses, they pose an increasing security threat. It’s a threat that has already been exploited: the 2016 Mirai botnet took advantage of unsecured IoT devices such as security cameras and wireless routers to unleash a massive distributed denial-of-service (DDoS) attack on key internet services around the world.

This attack and others have demonstrated that hackers can now craft attacks with unprecedented sophistication and correlate information not just from public networks, but from different private sources including our smart fridges, thermostats and cars.

Part of the issue that has left the IoT open to such vulnerabilities is the rapid pace at which it has progressed, with a seemingly constant stream of products coming to market from established brands and start-ups alike. In this quickly evolving world, every device made that connects to the internet expands the points of attack for hackers. A study by Hewlett Packard Enterprise showed that up to 70 percent of IoT devices contain serious vulnerabilities.

Cybersecurity issues with the IoT are becoming a hot topic, and consumers and businesses are becoming more aware of the potential risks these devices pose. A survey by digital security company Gemalto found that 90 percent of consumers lack confidence in the security of their IoT devices, and only 14 percent believe that they are extremely knowledgeable when it comes to the security of these devices. As for businesses, 75 percent reported that encryption is their primary method of securing IoT assets, with many also realising that they need support in understanding IoT technology and turning to partners for help.

The solution

As more people adopt IoT as a part of everyday life at home and in the workplace, regulations are needed to ensure our safety and security. Industry and government are catching up to the concerns of consumers and businesses with a raft of recently passed legislation and guidelines to secure the future of the IoT.

The most recent legislation comes from the UK government, putting in place new measures for manufacturers to boost cybersecurity in millions of internet-connected devices following a rise in cybersecurity breaches. Manufacturers of IoT devices will now be expected to build in tough new security measures that last the lifetime of the product.

This comes hot on the heels of the U.S. government’s Internet of Things Cybersecurity Improvement Act of 2017 created to establish guidelines for securing devices procured by the U.S. government.

Similarly, companies are beginning to adopt and develop guidelines to ensure the secure development and deployment of IoT devices. Central to these standards are identity-focused security solutions, which can help IoT security by managing the relationships between these devices, the entities controlling them, and the data being sent and received.
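Identity-focused approaches like this are often built on message authentication: each device holds its own key, and the platform accepts data only when a message's signature checks out against the key registered for that device. The sketch below is a hypothetical, simplified illustration of that idea (the device names, keys and function names are invented for this example, not any specific vendor's implementation):

```python
import hashlib
import hmac

# Hypothetical per-device keys; in practice these would be provisioned
# securely at manufacture and stored in a hardware-backed keystore.
DEVICE_KEYS = {"thermostat-01": b"key-provisioned-at-manufacture"}

def sign_reading(device_id, payload, key):
    """Device side: attach an HMAC tag so the platform can verify identity."""
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"device_id": device_id, "payload": payload, "tag": tag}

def verify_reading(message):
    """Platform side: accept data only from devices with a known key."""
    key = DEVICE_KEYS.get(message["device_id"])
    if key is None:
        return False  # unknown device, reject outright
    expected = hmac.new(key, message["payload"], hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking information via timing
    return hmac.compare_digest(expected, message["tag"])
```

A tampered payload or an unregistered device both fail verification, which is the relationship-management property the identity-focused solutions described above aim to provide at scale.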

One resource to help create guidelines and drive requirements for businesses to follow is the Open Web Application Security Project (OWASP), a repository of information on web applications security, which lays out cybersecurity suggestions in its IoT Attack Surface Areas Project.

Summary

With capable hackers everywhere, and their focus growing on the IoT due to the increasing flow of data around it, securing our interconnected devices and educating users to the risks has never been more important.

The cybersecurity network is adjusting to the demands of the Internet of Things with a better regulated industry and government legislation helping to minimise the threat from hackers. With some relatively simple cyber hygiene practices that stretch from the IT department to employees, organisations can stay connected and still be safe from cyber-attacks. However, we’re still likely to see bigger and more invasive attacks in the short term while we all get to grips with the risks as well as the benefits of our new interconnected world.

C5 Announces the Winners of Its Inaugural Shield in the Cloud Competition

Press release on:

C5, the investment specialist firm focused on cyber security, cloud computing and artificial intelligence, has announced the winners of its inaugural Shield in the Cloud competition. The challenge, supported by Amazon Web Services (AWS), PeaceTech Lab, and SAP NS2, was created to bring together the best and brightest minds working in anti-corruption technology.

The winners were announced at the Shield in the Cloud Awards gala, held at C5’s PeaceTech Accelerator. General Keith Alexander, founding Commander of the United States Cyber Command, former Director of the US National Security Agency (NSA) and current Chief Executive Officer of IronNet Cybersecurity, gave the keynote speech. The General spoke about the work being done by the NSA to combat corruption, both at home in the US and internationally. Andre Pienaar, Founder of C5 Capital, and Kurt Scherer of Booz Allen Hamilton also spoke at the event. The speakers awarded prizes in three categories: Dream Big, Not for Profit and Government.

The winners were:

1. Dream Big – Donor: Mark Labs

2. Dream Big – Local Community: MyndGenie

3. Dream Big – Global Trade: Pole Star – PurpleTRAC

4. Not for Profit: Ushahidi

5. Government: Transparency International Ukraine – ProZorro.Sale

The Special Operations Division (SOD) of the US Drug Enforcement Administration (DEA) and Citibank also received special recognition awards. The DEA were recognised for building an innovative model to combat the linkage between corruption, narcotics and terrorist finance, and Citibank for their support of global technology innovation through their Tech4Integrity programme.

Andre Pienaar, Founder, C5, said, “Shield in the Cloud is the first global innovation challenge to combat corruption and terrorist financing worldwide. The public cloud provides unprecedented opportunity to innovate, scale and strengthen good governance, everywhere. We were delighted to see the scale and pace of innovation among so many applicants in the challenge. We were honoured to meet so many leaders and entrepreneurs who are determined to fight organised crime, terrorist finance and corruption to protect their countries and communities, often against the odds.”

Georgina Callander, Director, Cloud Leadership Centre, said, “An incredible and uplifting number of applicants and discussions have come out of the challenge. It’s clear that the issues raised during our roundtables and interviews with key technology leaders and government officials are front-of-mind for all. This first competition has been a great success and we look forward to continuing to work in this field, not just with the winners but with some of the high-level companies that also participated in the Challenge.”

A total of 61 organisations connected with the challenge. A panel of expert judges then created a shortlist of 24 before selecting the winners. These judges included Patrick Garver, formerly Executive Vice President and General Counsel for Barrick; Andy Powell, Head of Product Management, Eduserv; Moira Andrews, Director, Praetor Consultants; Andre Pienaar, Founder, C5; and Kurt Scherer, Director, Booz Allen Innovation Center. The full shortlist can be seen here.

Prize winners will receive AWS cloud credits and the option of taking a place at C5’s PeaceTech Accelerator in Washington D.C. to further develop their products under the guidance of C5, AWS, SAP NS2 and PeaceTech Lab mentors.

Photos of the event are available on request, along with full profiles of each winning company.

–ENDS–

How Kimberly-Clark saved $250k with a platform powered by Tableau, Amazon Redshift, and Panoply

C5 portfolio company Panoply saves Kimberly-Clark money with AWS and Tableau.

Article from Tableau

We’re continually impressed by stories of customers driving innovation and making an impact with data. Kimberly-Clark, a Fortune 500 provider of personal care products, uses Tableau in conjunction with the Amazon Redshift data warehouse from AWS and Panoply (a company that automates Redshift optimization) to drive self-service retail analytics at scale.

In Europe, the company’s e-commerce data comes from multiple regions and disjointed data sources. The EMEA Analytics team is responsible for managing it, but their legacy systems did not provide the flexibility needed to quickly address the questions coming at them, which required considerable data analysis. The decision to use Tableau on Panoply has been a game-changer, saving the company $250,000 over two years and up to eight hours weekly, while also putting the power of secure data analysis in more hands. For the analytics team, the upside has been significant, with more time gained to interpret the data versus spending endless hours making sense of it.

Kimberly-Clark saves thousands of dollars, without straining resources

Ask Helena Carre, EMEA Omnichannel Analytics Lead at Kimberly-Clark, about the Tableau platform running on Panoply, and she’ll tell you it’s like having “analytic superpowers.”

Carre spends most of her time immersed in data, leading the EMEA Analytics team. Kimberly-Clark’s e-commerce data stems from 15 different regions, aggregated from disparate sources.

The diverse EMEA region has different SKUs and item descriptions for each retailer. The team also collects and analyzes consumer data from multiple internal sources (including data on sales and marketing spend), external sources (like SimilarWeb and Nielsen), and web applications. Combine those SKUs with the siloed data and duplicate data points inherent in their legacy systems, which include digital shelf analytics and a web-based platform, and their landscape is quite saturated.

It was mission-critical to find a nimble solution. Carre’s team chose to adopt Panoply’s AI-driven smart data warehousing solution in conjunction with Tableau. The powerful combination has saved the team over 400 hours in a year, equivalent to nearly a quarter of a million dollars, while enabling secure, automated data access for more professionals within the organization. They now spend less time collecting and wading through data and more time interpreting it. The combined strengths of Tableau and Panoply also give them a ‘playground’ where they can adjust datasets for ad-hoc discovery. Carre compares “a data pro without a sandbox to a painter without a canvas”, and her sandbox is something that wasn’t available with existing solutions.

“The one-two punch of using Tableau on Panoply for fast performance was the best possible solution for my team,” says Carre. “It gave me the things I needed – speed, automation, efficiency, flexibility – without blowing up my budget, increasing my headcount, or adding unneeded complexity. It continues to be a great complement to my existing resources.”

As an added bonus, new capabilities for the team have included being able to easily compare performance side-by-side by store format (e.g. online, in-store, express format, superstore). Examples of performance metrics now analyzed include pricing and promotions, pack size, market share and growth.

Expanding analytic capabilities, without requiring more staff

Carre and her team support Kimberly-Clark by reporting on the latest market share, price elasticity, and consumer trends. Because of the company’s large, complex datasets, Carre often called on IT resources to help her design queries and ETL processes for business intelligence questions before she could visualize them in Tableau. While this process worked, it was not agile enough for them to quickly answer the demanding business questions being asked.

The Tableau on Panoply solution addresses the data wrangling and analytics needs faced by Carre’s team, and has saved the company an immense amount of time and headcount. For example, on just one regional retail report, the team saved eight hours per week—equivalent to $250,000 every two years, as referenced above. This report is one example of Kimberly-Clark’s “huge return on investment,” bringing new capabilities without adding another Business Analyst to the team or undergoing lengthy requirements-gathering or customer development processes.

Under the hood: “Tableau on Panoply” solution

Panoply monitors queries in real time, automatically adjusting for optimal performance. To accomplish this, Panoply uses proprietary AI algorithms to learn usage patterns, optimize datasets, and cache frequently run queries, all with the goal of improving responsiveness for the end user. Tableau and Panoply ran a joint study looking at these performance improvements with common dashboard configurations. Over the course of this three-week study, they found that Panoply provided up to a 90 percent reduction in dashboard runtime.
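Panoply's actual algorithms are proprietary, but the general idea of caching frequently run queries can be sketched in a few lines. In the illustration below, identical queries arriving within a time-to-live window are served from a cache instead of hitting the warehouse again; the `QueryCache` class, the `execute` callable, and the TTL policy are all invented for this example:

```python
import hashlib
import time

class QueryCache:
    """Minimal illustration of result caching for repeated queries."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # cache key -> (timestamp, result)

    def _key(self, sql):
        # Normalise whitespace and case so trivially different query
        # texts share the same cache entry
        normalised = " ".join(sql.lower().split())
        return hashlib.sha256(normalised.encode()).hexdigest()

    def run(self, sql, execute):
        """Return a cached result if still fresh, else call execute(sql)."""
        key = self._key(sql)
        hit = self._store.get(key)
        if hit and time.time() - hit[0] < self.ttl:
            return hit[1]  # cache hit: skip the warehouse entirely
        result = execute(sql)
        self._store[key] = (time.time(), result)
        return result
```

A dashboard that refreshes the same handful of queries repeatedly only pays the warehouse cost once per TTL window, which is the kind of effect the joint study measured, though the production system is far more sophisticated than this sketch.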

To learn more, read the whitepaper:


Shield in the Cloud

The PeaceTech Accelerator is honouring the leaders and entrepreneurs who are driving innovation in the fight against terrorist finance, narcotics and corruption on 27 February.

How to take advantage of the rise of cloud computing in the GCC

Article by drie


Fuelled by plans to diversify sources of income and improve the quality of life for its citizens, the GCC is seeing a boom in the uptake of cloud computing.

Gartner expect spending on public cloud services in MENA to reach $2bn by 2020, with over 90% of enterprise workloads processed in the cloud.

The growth in platform as a service (PaaS) and software as a service (SaaS) shows that companies are now moving workloads from on-premise datacentres to the cloud and developing new, cloud-ready applications.

What’s behind the growth?

The recognised benefits of cloud computing — cost reduction, flexible capacity, scalability and so on — are no less important in the GCC, but the region has its own specific considerations.

Oil price pressures, population growth, citizens’ demand for improved technologies and the 2030 Agenda for Sustainable Development, a global initiative championed by the United Nations, mean that each of the GCC countries has developed its own programme for digital transformation.

Smart Tourism, Classroom of the Future, Next Generation Care and the use of cloud first policies in the public sector are examples. Private sector companies are positioning themselves to take advantage of these initiatives and putting cloud computing at the centre of their business development plans.

But despite the attraction, moving to cloud computing isn’t without difficulties.

The challenges in adopting cloud computing in the GCC

Skills shortage

Not having access to the right knowledge, skills and experience is an obvious constraint on cloud adoption. Although there have been big improvements in teaching ICT basics as part of the school curriculum, there’s a leap from there to a skilled DevOps practitioner.

And it’s not just technology skills. Leadership is needed to help turn technology into valuable use cases, and commercial awareness is a must-have for managing vendors. Defining, agreeing and managing the service agreements common to cloud services isn’t straightforward.

Worries about cybersecurity

There have been several well-publicised cyberattacks in the GCC that have pushed security considerations to front of mind for business and IT execs. Worries about loss of service or data theft always figure in user surveys and are the biggest practical concern for running workloads in the cloud.

Each country in the region has relatively new cybercrime laws that attract penalties for non-compliance, so companies need to have a well-constructed approach to compliance.

Fortunately, cybersecurity tools for cloud computing are well advanced and, correctly applied, go a long way towards avoiding such issues.

Limited practical experience of cloud deployment

There are plenty of good examples of successful cloud adoption in the GCC. Look no further than cloud-based businesses like Careem, the region’s first unicorn; Fetchr, also growing at an explosive rate; and FlyDubai, which implemented a new AWS-based online check-in platform in only four months.

There are also examples of good, well-run businesses that could benefit from a shift to cloud computing but don’t have the practical know-how to make it happen. They might have tried basic activities but haven’t attempted large or technically complex projects. For many, moving legacy apps and data to the cloud can seem like a big hurdle to jump.

Regulations and laws are unclear

As the use of cloud computing increases, so too does concern for data privacy. Current laws and regulations apply to some extent, but there are so many that they are difficult to interpret. They need to be easily understood, giving certainty to tech companies by removing any ambiguity in interpretation or enforcement.

Fortunately, the region now recognises this. Bahrain is introducing a law on data privacy protection in early 2018, and KSA has recently completed a public consultation on the use and operation of cloud computing.

There are also industry specific rules. Dubai International Financial Centre has its own data protection regulations that are closely aligned with the GDPR being introduced in the European Union in 2018.

The perception that cloud computing costs too much

Affordability of basic infrastructure and vendor solutions has been questioned, but that’s now being fixed.

Many of the leading global tech vendors are increasing their investment in the GCC. Amazon are opening their first Middle East data centres, in Bahrain, early next year, and the number of local, home-grown tech companies is increasing year on year as more start-ups appear.

Core infrastructure — data pipes and data centres — is critical for fast, cost-effective cloud operation, and the region has responded with a rolling investment programme in submarine cables for international traffic, national data network backbones and last-mile infrastructure.

Four steps for adopting cloud computing

GCC countries are in different states of readiness for dealing with these challenges — Qatar, UAE and Bahrain are the top three cloud-ready nations — and companies, apprehensive about making the first move, can help themselves by taking some straightforward actions.

1. Partner with a good vendor

Good vendors have varied experience, which means they get things done fast, know the common pitfalls to avoid and will pay close attention to concerns like security.

They also have a pool of well-trained engineers who can work closely with the in-house team to cover any resource gaps, share some of their knowledge and deliver technically complex or high-risk activities.

2. Decide what workloads should be moved to the cloud

Apps that need hefty re-writing or an architecture overhaul should be avoided, at least to begin with. Find some “manageable but meaningful” applications that will test the team but not break the company.

Hybrid cloud adoption might be the answer. Business critical apps can be left with the in-house team and the more straightforward utility apps — data backup and email are examples — can be moved.

3. Plan the move to the cloud carefully

Pick the right time based on usage cycles and interdependencies. Finance apps, for example, are typically heavily used at month/quarter end and the head of finance won’t count you as a friend if you mix up their financial reporting.

Small volumes of simple data can be moved using a one-and-done approach; large volumes — petabytes or exabytes — take a lot more planning and work.
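For larger volumes, the usual approach is to move data in fixed-size batches with retries, so one transient failure doesn't force restarting the whole transfer. A rough sketch under stated assumptions: `source_iter` and `write_batch` below are hypothetical stand-ins for whatever real storage APIs are involved, not any specific migration tool.

```python
import itertools

def move_in_chunks(source_iter, write_batch, chunk_size=1000, max_retries=3):
    """Copy records in fixed-size batches, retrying a failed batch.

    source_iter yields records; write_batch persists a list of them.
    Returns the total number of records moved.
    """
    moved = 0
    while True:
        batch = list(itertools.islice(source_iter, chunk_size))
        if not batch:
            return moved  # source exhausted
        for attempt in range(max_retries):
            try:
                write_batch(batch)
                break  # batch persisted, move on
            except IOError:
                if attempt == max_retries - 1:
                    raise  # give up after repeated failures
        moved += len(batch)
```

Because progress is tracked per batch, a run that fails partway can be resumed from the last successful chunk rather than from scratch, which matters a great deal at petabyte scale.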

Plan for any business downtime and make sure business continuity plans are up to date. With good planning, serious issues can be avoided but it’s best to be prepared.

4. Implement, learn and improve

Once the first workload has been moved, use standard regression, performance and operational assurance test sets to check everything works as expected and to confirm the integrity of data in its new location.
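One common integrity check is to compare the source and destination tables by row count and a content fingerprint. The sketch below is a simplified illustration (the function names are invented); real migrations typically also compare column checksums and sampled records:

```python
import hashlib

def table_fingerprint(rows):
    """Order-independent fingerprint of a table's rows.

    Each row is hashed individually and the hashes are XOR-combined,
    so the result is the same regardless of row order.
    """
    combined = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        combined ^= int.from_bytes(digest, "big")
    return combined

def verify_migration(source_rows, destination_rows):
    """Basic post-move checks: matching row count and content fingerprint."""
    if len(source_rows) != len(destination_rows):
        return False
    return table_fingerprint(source_rows) == table_fingerprint(destination_rows)
```

A check like this catches dropped or corrupted rows even when the destination returns data in a different order than the source did.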

Use the first implementations as an opportunity to learn and improve the next time. Rigorous planning and well managed execution are the characteristics of continued success, so taking the time to embed this approach across the team will make a real difference.

At drie we aim to be your end-to-end partner on your journey to the cloud: from deciding what your cloud strategy should look like, to migrating your processes, to helping you pick the right cloud-based products to scale your business. Please get in touch and we’ll arrange a chat about your cloud needs and see if we can help.
