
Developers Will Adopt Sophisticated AI Model Training Tools in 2018

By James Kobielus

Training is the make-or-break task in every development project that involves artificial intelligence (AI). Determining an AI application’s fitness for its intended use involves training it with data from the solution domain into which it will be deployed.

In 2018, developers will come to regard training as a potential bottleneck in the AI application-development process and will turn to their AI solution providers for robust training tools. Developers will adopt these tools to train AI models for disparate applications and deployment scenarios. By the end of the coming year, AI model training will emerge as the fastest-growing platform segment in big data analytics. To keep pace with growing developer demand, most leading analytics solution providers will launch increasingly feature-rich training tools.

During the year, we’ll see AI solution providers continue to build robust support for a variety of AI-model training capabilities and patterned pipelines in their data science, application development, and big-data infrastructure tooling. Many of these enhancements will build out the automated machine learning capabilities in their DevOps tooling. By year-end 2018, most data science toolkits will include tools for automated feature engineering, hyperparameter tuning, model deployment, and other pipeline tasks. At the same time, vendors will continue to enhance their unsupervised learning algorithms to speed up cluster analysis and feature extraction on unlabeled data. And they will expand their support for semi-supervised learning in order to use small amounts of labeled data to accelerate pattern identification in large, unlabeled data sets.
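
To make one of those pipeline tasks concrete, the short sketch below runs automated hyperparameter tuning with scikit-learn’s GridSearchCV; the dataset, model, and parameter grid are illustrative assumptions, not taken from any particular vendor’s toolkit.

# Minimal sketch: grid-search hyperparameter tuning of the kind 2018-era
# data science toolkits were expected to automate. Dataset and grid are
# illustrative only.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_digits(return_X_y=True)

param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [None, 10, 20],
}

search = GridSearchCV(RandomForestClassifier(random_state=0),
                      param_grid, cv=3, n_jobs=-1)
search.fit(X, y)
print("best parameters:", search.best_params_)
print("best cross-validated accuracy:", round(search.best_score_, 3))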

In 2018, synthetic (aka artificial) training data will become the lifeblood of most AI projects. Solution providers will roll out sophisticated tools for creating synthetic training data and the labels and annotations needed to use it for supervised learning.
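
A minimal sketch of the idea, with scikit-learn’s built-in synthetic data generator standing in for the more sophisticated tooling anticipated here: artificial, pre-labeled examples are generated and then used to train a supervised model.

# Illustrative only: generate labeled synthetic data and train on it.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20,
                           n_informative=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=42)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy on synthetic data:", round(clf.score(X_test, y_test), 3))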

The surge in robotics projects and autonomous edge analytics will spur solution providers to add strong reinforcement learning to their AI training suites in 2018. This will involve building AI modules that can learn autonomously with little or no “ground truth” training data, though possibly with human guidance. By the end of the year, more than 25 percent of enterprise AI app-dev projects will involve autonomous edge deployment, and more than 50 percent of those projects will involve reinforcement learning.
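
For readers new to the approach, the toy sketch below shows the core reinforcement learning loop: an agent improves from reward signals alone, with no labeled training data. The one-dimensional corridor environment and its parameters are assumptions chosen purely for illustration.

# Toy tabular Q-learning: the agent learns to walk right along a corridor
# toward a rewarding goal state, using only reward feedback.
import random

N_STATES, GOAL = 6, 5            # states 0..5, reward only at state 5
ACTIONS = [-1, +1]               # step left or right
Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, eps = 0.5, 0.9, 0.1

for episode in range(500):
    s = 0
    while s != GOAL:
        greedy = Q[s].index(max(Q[s]))
        a = random.randrange(2) if random.random() < eps else greedy
        s_next = max(0, min(N_STATES - 1, s + ACTIONS[a]))
        reward = 1.0 if s_next == GOAL else 0.0
        Q[s][a] += alpha * (reward + gamma * max(Q[s_next]) - Q[s][a])
        s = s_next

print("learned policy:", ["right" if q[1] >= q[0] else "left" for q in Q])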


During the year, more AI solution providers will add collaborative learning to their neural-net training tools. This involves distributed AI modules collectively exploring, exchanging, and exploiting optimal hyperparameters so that all modules may converge dynamically on the optimal trade-off of learning speed vs. accuracy. Collaborative learning approaches, such as population-based training, will be a key technique for optimizing AI that’s embedded in IoT&P (Internet of Things and People) edge devices.

It will also be useful for optimizing distributed AI architectures such as generative adversarial networks (GANs) in the IoT, clouds, or even within server clusters in enterprise data centers. Many such training scenarios will leverage evolutionary algorithms, in which AI model fitness is assessed emergently by collective decisions of distributed, self-interested entities operating from local knowledge with limited sharing beyond their neighbor entities.
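
A highly simplified sketch of the population-based training idea: a population of workers trains in parallel, and weak performers periodically copy a strong performer’s hyperparameters and perturb them. A stand-in objective replaces real model training here, and the learning-rate range and perturbation factors are assumptions.

# Toy population-based training loop (exploit, then explore).
import random

def performance(lr):
    # Assumed stand-in objective: best performance near lr = 0.01.
    return -abs(lr - 0.01)

population = [{"lr": random.uniform(0.0001, 0.1)} for _ in range(8)]

for step in range(50):
    population.sort(key=lambda w: performance(w["lr"]), reverse=True)
    best, worst = population[0], population[-1]
    # Exploit: the weakest worker copies the best; explore: perturb the copy.
    worst["lr"] = best["lr"] * random.choice([0.8, 1.2])

best = max(population, key=lambda w: performance(w["lr"]))
print("best learning rate found:", round(best["lr"], 4))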

Another advanced AI-training feature we’ll see in AI suites in 2018 is transfer learning. This involves reusing some or all of the training data, feature representations, neural-node layering, weights, training method, loss function, learning rate, and other properties of a prior model. Typically, a developer relies on transfer learning to tap into statistical knowledge that was gained on prior projects through supervised, semi-supervised, unsupervised, or reinforcement learning. Wikibon has seen industry progress in using transfer learning to carry the hard-won knowledge gained in training one GAN over to GANs in adjacent solution domains.
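
A hedged sketch of the pattern using the Keras API: an ImageNet-pretrained base network is frozen so its learned feature representations are reused, and only a small task-specific head is trained. The choice of base model, input size, and number of target classes are illustrative assumptions.

# Transfer learning sketch: freeze a pretrained base, train a new head.
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(include_top=False,
                                          weights="imagenet",
                                          input_shape=(160, 160, 3),
                                          pooling="avg")
base.trainable = False  # reuse prior statistical knowledge as-is

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(10, activation="softmax"),  # new task-specific head
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()  # only the new head's weights are trainable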

Also during the year, edge analytics will continue to spread throughout enterprise AI architectures. On-device AI training at edge nodes will become a standard feature of mobile and IoT&P development tools. Already, we see it in many leading IoT and cloud providers’ AI tooling and middleware.

About the author: James Kobielus is a Data Science Evangelist for IBM. James spearheads IBM’s thought leadership activities in data science. He has spoken at such leading industry events as IBM Insight, Strata Hadoop World, and Hadoop Summit. He has published several business technology books and is a popular provider of original commentary on blogs and social media.

Related Items:

Giving Machine Learning Freer Rein to Design Next-Generation Communications Protocols

Training Your AI With As Little Manually Labeled Data As Possible

Scrutinizing the Inscrutability of Deep Learning

The post Developers Will Adopt Sophisticated AI Model Training Tools in 2018 appeared first on Datanami.

Read more here:: www.datanami.com/feed/

NXP and Baidu partner on Apollo open autonomous driving platform

By Zenobia Hegde

NXP Semiconductors, the automotive semiconductor supplier, and Baidu, Inc., the Chinese-language search provider, announced a cooperation in autonomous driving. Under the terms of the agreement, NXP will join Baidu’s open autonomous driving platform, Apollo, and provide semiconductor products and solutions including millimeter wave radar, V2X, security, smart connectivity and in-vehicle experience technologies.

First announced in April 2017, Apollo is Baidu’s open autonomous driving platform which provides a comprehensive, secure and reliable all-in-one solution supporting all major features and functions of an autonomous vehicle. Baidu refers to Apollo as the Android of the autonomous driving industry, but more open and more powerful, allowing partners to go from zero to one and quickly assemble their own autonomous vehicles and start their product R&D. Apollo has now attracted over 70 global partners.

Details of the collaboration include:

NXP will provide semiconductor products and solutions for autonomous driving, millimeter wave radar, V2X, security, smart connectivity and in-vehicle experiences
Companies will leverage the NXP BlueBox development platform’s low energy consumption, high performance and functional safety benefits
NXP and Baidu will collaborate on sensor integration and high-performance processors for deep learning networks
Baidu’s conversational in-car system, DuerOS for Apollo, will incorporate NXP infotainment solutions for faster time to market and enhanced performance

“The automobile industry in China continues to advance at an amazing pace,” noted Kurt Sievers, executive vice president and general manager of NXP’s automotive business. “NXP is proud to collaborate with Baidu on the success of the Apollo platform. We believe that NXP and Baidu have incalculable synergies to bring to the new automotive revolution.”

Li Zhenyu, general manager of Baidu’s Intelligent Driving Group, said, “The collaboration with NXP is a significant step toward the application of Baidu’s autonomous driving and connected car technologies. NXP’s entry into the Apollo platform will inject momentum into our intelligent and autonomous vehicle ecosystem, creating benefits for the intelligent vehicle industry in China and the world.”

The post NXP and Baidu partner on Apollo open autonomous driving platform appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/

How Can Companies Balance Between Too Much Control and Too Much Access to Data?

By Andrew Brust

The Big Data & Brews video blog series continues with host Andrew Brust, Senior Director of Market Strategy and Intelligence at Datameer. The series touches on hot topics within the business of Big Data, Analytics, Internet of Things, Machine Learning, Cloud Computing, Modern BI, NoSQL and Next […]

The post How Can Companies Balance Between Too Much Control and Too Much Access to Data? appeared first on DATAVERSITY.

Read more here:: www.dataversity.net/feed/

Cloudera and Tata Communications launch big data platform to tackle data deluge

By Zenobia Hegde

Cloudera, Inc., the modern platform for machine learning and analytics optimised for the cloud, and Tata Communications, a global provider of network, cloud, mobility and security services, announced a strategic partnership that enables enterprises to unleash the power of their data to fuel business growth.

Leveraging Tata Communications’ global cloud footprint, underpinned by the world’s largest Tier one network, together with Cloudera’s machine learning and analytics capabilities, the companies’ managed services for Big Data offering enables enterprises to quickly capture, store and analyse data in various formats and across multiple sources.

IDC predicts that 163 zettabytes of data will be generated by 2025, which is ten times the data generated in 2016. This exponential growth leaves enterprises with the challenge of managing various types of data, from different sources, without busting budgets.

This solution directly tackles this challenge by enabling organisations to better structure a wide variety of types and volumes of data, transforming overhead cost into a profit centre. With this, enterprises can capitalise on their data to drive greater productivity, enhance customer experiences and spur innovation.

“Enterprises are already capturing and storing the data that could fuel their growth, if managed efficiently and effectively. With Cloudera’s leadership in the fields of machine learning and advanced analytics and our network and cloud capabilities, managed services for Big Data will help enterprises tackle this data deluge by consolidating data from all sources, both on-premise and cloud-based, into a centralised big data platform,” said Srinivasan CR, senior vice president, Global Product Management & Data Centre Services, Tata Communications.

“Our solution provides enterprises with the scale, speed and expertise to quickly transform raw data into structured and meaningful insights that address business challenges.”

Managed services for Big Data is available in dedicated or multi-tenant private cloud environments, maintaining enterprise-grade regulatory and privacy standards. With a flexible pay-as-you-use cost model, the platform allows customers to easily scale data on-demand, according to the capacity required.

The solution suite includes key features such as data lifecycle management across acquisition, awareness and modelling, analytics and governance; managed analytics and visualisation tools; and professional services to help enterprises define, design and implement their big data strategy.

“Organisations are still relying on legacy solutions to deal with today’s challenge of large and growing amounts of data. The managed services offering for Big Data helps enterprises turn this into an opportunity, with a modern and integrated big data platform that deploys machine learning and advanced analytics.

“Our strategic partnership with Tata Communications empowers organisations with a solution that derives real value from data and truly enables businesses with data-driven decision-making capabilities,” said Mark Micallef, vice president, Asia Pacific and Japan at Cloudera.

This solution is hosted in Tier three data centres in Mumbai, Singapore and the United Kingdom, which are certified to TIA-942 standards and offer complete durability, with customer data replicated across big data nodes. It is backed by ISO 20000-aligned service level agreements and data management processes based on ISO 27001 standards. It also incorporates Cloudera’s ready-to-deploy Enterprise Data Hub to power fast and secure analysis of data […]

The post Cloudera and Tata Communications launch big data platform to tackle data deluge appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/

With a modern data platform, any CIO can turn information into intelligence

By Zenobia Hegde

It’s been more than a decade since we first heard the phrase “data is the new oil.” But while this idea may well define the next generation of business, there’s important context surrounding it that often gets overlooked.

Since the phrase was coined, many others have echoed these same words. What we have now come to realise is that data is a commodity, a raw material, and it’s only valuable when it can be turned into intelligence. Without the right tools to refine it, it’s just a bunch of ones and zeros. Study after study shows that while most enterprises understand the importance of data, they continue to struggle to draw real value from it, says Matt Mills, CEO and Board Member at MapR Technologies.

Amongst the largest and most powerful tech titans in the world, the idea of data turned into intelligence is gospel. It’s why companies like Apple, Amazon, Google, Facebook, and Microsoft dominate the charts these days. They not only know the value of data, but also how to transform it from a raw material into a competitive advantage.

Unlike these companies, many organisations today are using 30-year-old technologies to “refine” their data and are frustrated with the little progress they are making. The simple fact is that older technologies are often too fragile and simply aren’t built for the diversity or the sheer volume of today’s data. And this is just the beginning: many believe that from now on, data and the digital universe will double in size every two years.

Successful data-driven companies in the early stages of their digital transformation journeys are choosing a modern data platform that is both optimised for performance today and provides the speed, scale and reliability that are required for next-generation intelligent solutions.

The modern data platform has 10 key characteristics:

A single platform that runs analytics and applications at once
Manages all data from big to small, structured and unstructured, tables, streams, or files – all data types from any source
A database that runs rich data-intensive applications and in-place analytics
Global cloud data fabric that brings together all data from across every cloud to ingest, store, manage, process, apply, and analyse data as it happens
Diverse compute engines to take advantage of analytics, machine learning and artificial intelligence
Delivers cloud economics by operating on any and every cloud of your choice, public or private
No lock-in, supporting open APIs
DataOps ready to champion the new process to create and deploy new, intelligent modern applications, products and services
Trusted with security built from the ground up
Streaming and edge first for all data-in-motion from any data source as data happens and enabling microservices natively

Many software products today can handle some aspects of modern day data platforms, but few if any can actually deliver on all of these requirements. This is where most companies get into trouble. They try to extend the software to do things it was never intended to do. These limitations are a key reason why many companies never reach their goals and objectives with data.

Here at MapR we have built the premier data platform for today — and tomorrow’s — leading enterprises. […]

The post With a modern data platform, any CIO can turn information into intelligence appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/

Predicting the maintenance

By Zenobia Hegde

It happens at the worst of times – late for a meeting, on the way to the rugby and even when you’re desperate for the bathroom. When your car breaks down, you can moan in retrospect, acknowledging the signs that it needed urgent maintenance. Thanks to technology, more specifically the evolution and application of cognitive learning, these frustrating occurrences will become a thing of the past.

Connecting the things

Analyst house Gartner forecasts that there will be 20.8 billion connected ‘things’ worldwide by 2020. Enterprises that stick to an old ‘preventive’ data methodology, says Mark Armstrong, managing director and vice-president International Operations, EMEA & APJ at Progress, are going to be left behind, as this approach addresses a mere 20% of failures.

Predictive maintenance brings a proactive and resource-saving opportunity. Predictive software can not only alert the manufacturer or user when equipment failure is imminent, but also carry out the maintenance process automatically ahead of time. This is calculated from real-time data, via metrics including pressure, noise, temperature, lubrication and corrosion, to name a few.

By considering degradation patterns that illustrate the wear and tear of the vehicle in question, the production process is subject to far less interruption than it would be without the technology. By monitoring systems ‘as live’, breakdowns can be avoided before they happen.
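
A minimal sketch of what such monitoring might look like in practice, assuming simulated sensor readings and scikit-learn’s IsolationForest as the anomaly detector; a real deployment would use live vehicle telemetry and more sophisticated degradation models.

# Illustrative only: learn normal operating behaviour, then flag drift.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Columns: pressure, temperature, vibration under normal operation (simulated).
normal = rng.normal(loc=[30.0, 90.0, 0.5], scale=[2.0, 5.0, 0.1], size=(1000, 3))

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

new_reading = np.array([[38.0, 120.0, 1.4]])  # hypothetical degraded component
if detector.predict(new_reading)[0] == -1:
    print("maintenance alert: reading is outside normal operating behaviour")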

It’s no longer a technological fantasy. Because data from cars has been collected for decades, researchers and manufacturers can gather insights that can be used to build predictive analytics. This will assist in predicting which individual cars will break down and need maintenance.

Now that the Internet of Things (IoT) is a reality, car manufacturers can use this information to offer timely and relevant additional customer services based on sophisticated software that can truly interrogate, interpret and use data. So who is going to be responsible for taking advantage of this technology?

Bolts and screws

Key management figures in the transport industry must commit to a maintenance management approach to implement a long-term technological solution. As described by R. Mobley, run-to-failure management sees an organisation refrain from spending money in advance, only reacting to machine or system failure. This reactive method may result in high overtime labour costs, high machine downtime and low productivity.

Similarly reactive, preventive maintenance monitors the mean time to failure (MTTF), based on the principle that new equipment is at its most vulnerable during the first few weeks of operation and again as it ages with use. This can manifest itself in various guises, such as engine lubrication or major structural adjustments. However, predicting the time frame in which a machine will need to be reconditioned may be unnecessary and costly.
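
To make the contrast concrete, here is a toy sketch of the MTTF-based scheduling logic behind preventive maintenance; the failure records and safety margin are invented for illustration.

# Toy preventive-maintenance schedule from historical failure data.
failure_hours = [4200, 3900, 4500, 4100, 4300]   # hours to failure, past units

mttf = sum(failure_hours) / len(failure_hours)
safety_margin = 0.8                               # assumed: service at 80% of MTTF
service_interval = mttf * safety_margin

print(f"estimated MTTF: {mttf:.0f} h; recondition every {service_interval:.0f} h")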

As an alternative option, predictive maintenance allows proactivity, ensuring lengthier intervals between scheduled repairs whilst reducing the number of crises that have to be addressed due to mechanical faults. With a cognitive predictive model, meaning applications are able to teach themselves as they function, organisations will be able to foresee exactly why and when a machine will break down, allowing them to act […]

The post Predicting the maintenance appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/

Amazon, Google and technology incumbents are capitalising on AI’s potential in the smart home

By Zenobia Hegde

A new report from Navigant Research provides an analysis of the role digital assistants play in the smart home and the applications that artificial intelligence (AI) enables.

AI is fundamentally changing the way consumers interact with their homes, offering enhanced solutions, including digital assistants, that help manage energy consumption, keep homes safer and more secure, and create more intuitive home automation. Myriad opportunities exist to automate tasks and create more personal experiences with technology, especially as adoption of the Internet of Things increases and customer expectations grow. According to a new report from @NavigantRSRCH, many players recognise this potential, and large tech incumbents like Amazon, Google, Apple, Samsung, and Microsoft are quickly investing in AI and digital assistants for the home.

“Current AI technologies are being used to automate tasks, identify consumer trends, and power human-like digital assistants to make people’s lives more comfortable, convenient, and efficient,” says Paige Leuschner, research analyst with Navigant Research. “Consumer-ready AI technology isn’t about creating the human-like robots portrayed in popular media, it’s about making impactful and significant progress that delivers value to consumers.”

Indeed, despite promises of market disruption and estimates of aggressive growth, AI technology is still developing, and it remains to be seen how it will evolve in the smart home. According to the report, it is important for stakeholders involved in this space to move past the hype and continue to deliver significant, real-world value.

The report, Digital Assistants and AI in the Home, examines how machine learning, deep learning, and neural networks fall under the umbrella term of AI. The study provides an analysis of how digital assistants play a role in this technology ecosystem and the applications that AI is enabling in the smart home. It also examines the major market drivers and barriers and technology trends related to AI, as well as the key players and digital assistants.

The post Amazon, Google and technology incumbents are capitalising on AI’s potential in the smart home appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/

Cloudera: 2018 IoT Predictions

By IoT – Internet of Things

When it comes to IoT, will hybrid analytics prevail? According to Dave Shuman, IoT & Manufacturing Leader at Cloudera, in 2018 two major shifts will happen with regard to IoT. Shuman predicts we will continue to see IoT being less about connectivity and more about data analytics and machine learning. He also predicts 2018 will […]

The post Cloudera: 2018 IoT Predictions appeared first on IoT – Internet of Things.

Read more here:: iot.do/feed

Fortum deploys DigitalRoute technology to enable data driven services supporting energy sector transformation

By Zenobia Hegde

DigitalRoute, the Data Integration and Data Management company, and key European energy provider, Fortum, have jointly announced the completion of a successful Proof of Concept project. Fortum is a clean-energy company offering electricity, heating and cooling solutions as well as driving the change for a cleaner world.

Fortum has been looking for an IoT enabling platform to develop data-driven solutions and has now chosen to partner with DigitalRoute. DigitalRoute’s platform allows for real-time communication between assets, plants, data sources and business systems, resulting in optimal quality data, which can be utilised to simplify the deployment of new solutions and optimise key processes.

The recently completed Proof of Concept focused on optimising two different operational processes, both highlighting the benefits of data quality: securing high-quality data and then applying machine learning in order to optimise trading decisions.

The Use Case showed how DigitalRoute’s solution could be used to predict intraday prices on the Nordic Energy market using external and internal data. Direct integration between the DigitalRoute Data Optimisation Platform and market data sources enabled this provider to gain increased operational control, speed and commercial agility.
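
As a rough illustration of the kind of forecast involved (not DigitalRoute’s or Fortum’s actual model), the sketch below fits a simple regression from lagged, simulated market signals to the next-hour price.

# Illustrative only: simulated demand/wind signals drive a toy price series.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
hours = 500
demand = rng.normal(50, 10, hours)
wind = rng.normal(20, 5, hours)
price = 2.0 * demand - 1.5 * wind + rng.normal(0, 3, hours)

X = np.column_stack([demand[:-1], wind[:-1], price[:-1]])  # lagged features
y = price[1:]                                              # next-hour price

model = LinearRegression().fit(X, y)
print("next-hour price forecast:", model.predict(X[-1:]).round(2))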

Fortum’s vice president, trading and asset optimisation, Simon-Erik Ollus, observed: “In partnership with DigitalRoute, we can increase both our momentum in the market as well as our self-sufficiency. We can focus on simplifying the deployment of new solutions and optimising business processes, enabling a fast time-to-market.”

Fortum will now utilise the experiences from the Proof of Concept in other areas such as operation and maintenance optimisation of charging infrastructure, Smart Living services as well as further solutions within energy trading.

DigitalRoute CEO Andreas Zartmann said: “Our technology is a game changer for cutting-edge utility providers and we are delighted that Fortum is working with us to demonstrate how energy companies can use it to turn innovations from an opportunity into a commercial success. We are also excited to partner with a visionary player in a fast-changing industry. Once again, this underlines the wide scope of challenges DigitalRoute’s technology can address.”

The post Fortum deploys DigitalRoute technology to enable data driven services supporting energy sector transformation appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/

Alchemy IoT introduces ‘IoT Asset Intelligence’ associated with Big Data, IoT and AI

By Zenobia Hegde

Alchemy IoT, a provider of AI-powered applications for industrial IoT, introduced “IoT Asset Intelligence,” a new industry approach that addresses the complexity and market fragmentation associated with Big Data, IoT and artificial intelligence. It describes today’s disruptive mega-trends and Alchemy’s industry insights and vision, before outlining practical steps organisations can use to align company goals, resources and teams toward value streams centred on industrial equipment, fleets and operational performance.

“IoT Asset Intelligence combines our best thinking into an actionable framework to plan, implement and gain value from AI-based IoT initiatives – especially for smaller organisations that may lack the resources for expensive consultants and data scientists,” said Victor Perez, CEO of Alchemy. “We are committed to the customer journey as much as the technology that is transforming the way industrial companies operate and win in the market.”

Recent studies by the International Data Corporation (IDC) estimate that by the year 2020 there will be 30 billion connected devices globally. With connected devices outputting sensor data 24 hours a day, seven days a week, the sheer scale of data production is staggering. The IoT Asset Intelligence framework provides practical steps for improving operational efficiencies, empowering better decision making, promoting collaboration and innovation, and delivering more value to the organisation.

The central theme of the IoT Asset Intelligence framework is visibility and efficiency through a blend of data-driven processes and a renewed culture around innovation and critical thinking. Like Lean and other operational improvement methods, IoT Asset Intelligence requires executive support and leadership to achieve sustainable success – as change and breaking with entrenched norms is as much a cultural “transformation” exercise as it is a process and technology play.

The following are the IoT Asset Intelligence core tenets:

Appoint an IoT champion: to be the single point of contact for the initiative and to help shift company culture.
Establish top-line goals and ROI opportunities: establish goals early in the process to have a clear path to success.
Conduct asset and process assessments: identify new data points and gain a strong understanding of which assets and processes already exist in the organisation, in order to better optimise them.
Identify processes for improvement and align new value streams: these processes are the best candidates for improvement.
Apply AI and machine learning for data analytics and automated workflows: not all data is good data – leverage AI to filter out irrelevant data to make data streams more powerful.
Present performance data in graphs for contextual 360-degree views: teams need to access performance data in a clear, visual representation that will allow them not only to see progress, but identify areas for improvement.
Integrate with existing ERP, EAM, MRO and MRP platforms: work with existing tools to optimise and enhance current processes, not just create new ones.
Provide graphical intelligence for executive decision-making and strategic planning: leverage data intelligence to make meaningful business decisions at the highest levels.
Practice continuous improvement: always improve – never be satisfied.

Alchemy will soon publish a comprehensive paper on the IoT Asset Intelligence framework that will be available from its website. For more information on the company, please […]

The post Alchemy IoT introduces ‘IoT Asset Intelligence’ associated with Big Data, IoT and AI appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/