IPv6 and IoT News

‘Unclonable’ security ICs with ChipDNA technology launched to protect IoT customers

By Zenobia Hegde

Designers can now proactively and inexpensively protect their intellectual property (IP) and products with a solution that is claimed to be immune to invasive physical attacks.

So says Munich-based Maxim Integrated Products, Inc., which has launched the DS28E38 DeepCover® secure authenticator. “Security can be complicated,” says Don Loomis, vice president of Maxim’s Micros, Security & Software Business Unit, “but avoiding it is costly.”

It’s a fair starting point when assessing security in the Internet of Things, says Jeremy Cowan. “If you’re doing something valuable you should secure it. And with medical equipment it can be pretty heavy stakes,” Loomis adds.

Cyberattacks continue to make headlines, and Internet of Things (IoT) devices have been a point of vulnerability: cybercrime damages are projected to cost the world US$6 trillion annually by 2021, according to Cybersecurity Ventures.

Yet design security remains an afterthought, with many engineers believing that implementing security is expensive, difficult, and time-consuming, while others leave it up to software to protect their systems. Additionally, when secure integrated circuits (ICs) are used, some are compromised by sophisticated, direct, silicon-level attacks commonly launched in an attempt to obtain cryptographic keys and secured data from these ICs.

The DS28E38 features Maxim’s ChipDNA physical unclonable function (PUF) technology, which the company claims makes it “immune to invasive attacks” because the ChipDNA-based root cryptographic key does not exist in memory or any other static state. Instead, Maxim’s PUF circuit relies on the naturally occurring random analogue characteristics of fundamental MOSFET (Metal-Oxide Semiconductor Field-Effect Transistor) semiconductor devices to produce cryptographic keys.

When needed, the circuit generates the key that is unique to the device, and which instantly disappears when it is no longer in use. If the DS28E38 were to come under an invasive physical attack, the attack would cause the sensitive electrical characteristics of the circuit to change, further impeding the breach.

“With Maxim’s ChipDNA PUF technology, the DS28E38 secure authenticator is highly effective and resistant against physical or black-box reverse engineering attacks,” says Michael Strizich, president of MicroNet Solutions Inc. “Even in a worst-case insider attack, the PUF-generated data is likely to remain protected due to the security features implemented by Maxim.”

In addition to the protection benefits, ChipDNA technology simplifies or eliminates the need for complicated secure IC key management as the key can be used directly for cryptographic operations. The ChipDNA circuit has also demonstrated high reliability over process, voltage, temperature, and ageing.

Additionally, to address cryptographic quality, the PUF output has passed evaluation against the NIST-based randomness test suite. Using the DS28E38, engineers can build a defence against hacking into their designs from the start. The IC is said to be low-cost and simple to integrate into a customer’s design via Maxim’s single-contact 1-Wire® interface, combined with a low-complexity, fixed-function command set that includes cryptographic operations.
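For illustration only, here is a minimal host-side sketch of the kind of ECDSA challenge-and-response flow a secure authenticator performs, written in Python with the cryptography package. It is not Maxim’s 1-Wire command set: the software key pair below merely stands in for the key the PUF circuit derives inside the DS28E38, and the function names are hypothetical.

```python
# Sketch of ECDSA challenge-response authentication, the general pattern a
# secure authenticator implements in hardware. The PUF-derived device key is
# simulated with a software key pair; a real authenticator IC never exposes
# its private key. Requires the 'cryptography' package.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

# Stand-in for the key the PUF circuit derives on demand inside the IC.
device_private_key = ec.generate_private_key(ec.SECP256R1())
device_public_key = device_private_key.public_key()  # shared with the host at provisioning

def device_sign_challenge(challenge: bytes) -> bytes:
    """What the authenticator does: sign the host's random challenge."""
    return device_private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

def host_verify(challenge: bytes, signature: bytes) -> bool:
    """What the host does: verify the signature against the known public key."""
    try:
        device_public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

challenge = os.urandom(32)  # fresh random challenge for each authentication
signature = device_sign_challenge(challenge)
print("authentic device" if host_verify(challenge, signature) else "reject")
```

The hardware advantage described above is that the equivalent of the private key in this sketch is regenerated on demand from the chip’s analogue characteristics rather than stored, so there is no static secret for an invasive attack to extract.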

“Designing in hardware-based security early on doesn’t require a lot of effort, resources, or time,” says Scott Jones, managing director of Embedded Security at Maxim Integrated. “With the ChipDNA technology-based DS28E38, designers can easily fortify their products with the highest level of protection. […]

The post ‘Unclonable’ security ICs with ChipDNA technology launched to protect IoT customers appeared first on IoT Now – How to run an IoT enabled business.

Read more here: www.m2mnow.biz/feed/

Flo Technologies Brings Smart Home Technology To Household Plumbing

By IoT – Internet of Things

Flo Technologies has today unveiled its Flo water security and management system, which it bills as the world’s most advanced and intelligent home water monitoring and conservation solution, with the launch of a limited early adopter program. The company says the water monitoring and shut-off system is the first device on the market to sense water pressure, flow rate and temperature […]

The post Flo Technologies Brings Smart Home Technology To Household Plumbing appeared first on IoT – Internet of Things.

Read more here: iot.do/feed

TUV Rheinland: Data Protection with IoT Data Privacy Certificates

By IoT – Internet of Things

TUV Rheinland’s Global Competence Center for IoT Privacy has announced a new package of services addressing end-to-end data protection requirements in the rapidly growing Internet of Things (IoT) market. The company says its protected privacy certificates are the first of their kind and give it a differentiated set of capabilities. The solution is focused on providing a product and […]

The post TUV Rheinland: Data Protection with IoT Data Privacy Certificates appeared first on IoT – Internet of Things.

Read more here: iot.do/feed

IoT Automation and the Modern Software Factory | @ThingsExpo #IoT #M2M #Automation

For what feels like a lifetime, industry analysts and experts have predicted ‘this’ will be the year the Internet of Things (IoT) finally takes off, both within industry and for consumers. But, as if by clockwork, each year of ‘guaranteed’ IoT explosion has passed without the mass adoption and exploitation of the Internet of Things materializing.

The IoT vision is almost utopian in its promise: a connected intelligent home or office, offering a simpler more convenient future. Our houses will be populated by smart ‘things’; thermostats that know precisely how warm we like our rooms, speakers that understand our taste in music and refrigerators that realize when we’re running low on the essentials and act to replenish themselves. Our cities too will become more intelligent; our highways will be populated by sensors, bringing congestion to an end; our streets safeguarded by interconnected surveillance. This will all be made possible via the communication of a global set of devices.

read more

Read more here: iot.sys-con.com/index.rss

Italtel Introduces Open Innovation Program

By IoT – Internet of Things

Italtel, a leading telecommunications company in IT system integration, managed services, Network Functions Virtualization (NFV) and all-IP solutions, has launched an Open Innovation program, which will see it collaborate with start-ups and new businesses to leverage emerging technologies for applications such as Industry 4.0, Smart Cities and Digital Healthcare. The program will see Italtel […]

The post Italtel Introduces Open Innovation Program appeared first on IoT – Internet of Things.

Read more here: iot.do/feed

IoT needs to be secured by the network

By Jon Gold

Everyone who has a stake in the internet of things, from device manufacturers to network service providers to implementers to customers themselves, makes important contributions to the security or lack thereof in enterprise IoT, attendees at Security of Things World were told.

“The key to all [IoT devices] is that they are networked,” Jamison Utter, senior business development manager at Palo Alto Networks, told a group at the conference. “It’s not just a single thing sitting on the counter like my toaster, it participates with the network because it provides value back to business.”

“I think the media focuses a lot on consumer, because people reading their articles and watching the news … think about it, but they’re not thinking about the impact of the factory that built that consumer device, that has 10,000 or 20,000 robots and sensors that are all IoT and made this happen.”


Read more here: www.networkworld.com/category/lan-wan/index.rss

Consumers Want IoT Toys Regardless of Security, Survey Finds

As the holiday shopping season gets underway, many consumers will pick up new IoT devices, even though many of those devices might represent security risks.

Read more here: www.eweek.com/rss.xml

[video] Integrating IoT Technology with Evatronix | @ThingsExpo #DX #IoT #M2M #Sensors

“Evatronix provides design services to companies that need to integrate the IoT technology in their products but they don’t necessarily have the expertise, knowledge and design team to do so,” explained Adam Morawiec, VP of Business Development at Evatronix, in this SYS-CON.tv interview at @ThingsExpo, held Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA.

read more

Read more here: iot.sys-con.com/index.rss

Smart Tech Healthcare 2018 Summit – Revolutionizing Indian Healthcare Sector Using Digital Technologies

By IoT – Internet of Things

Explore Exhibitions & Conference LLP, a leading business intelligence company, is hosting the 2nd Annual Smart Tech Healthcare 2018 Summit on 1st & 2nd February 2018 in Bengaluru, India. Two days to discover and exchange, and to build business-to-business partnerships in the healthcare sector, e-health and technology-enabled care services. The conference will bring together the […]

The post Smart Tech Healthcare 2018 Summit – Revolutionizing Indian Healthcare Sector Using Digital Technologies appeared first on IoT – Internet of Things.

Read more here: iot.do/feed

What’s Keeping Deep Learning In Academia From Reaching Its Full Potential?

By Scott Clark

Deep learning is gaining a foothold in the enterprise as a way to improve the development and performance of critical business applications. It first gained traction in companies optimizing advertising and recommendation systems, such as Google, Yelp, and Baidu. But the space has seen a huge level of innovation over the past few years thanks to open-source deep learning frameworks such as TensorFlow, MXNet, and Caffe2, which democratize access to powerful deep learning techniques for companies of all sizes. Additionally, the rise of GPU-enabled cloud infrastructure on platforms like AWS and Azure has made it easier and cheaper than ever for firms to build and scale these pipelines.

Now, its use is extending to fields like financial services, oil and gas, and many other industries. Tractica, a market intelligence firm, predicts that deep learning enterprise software spending will surpass $40 billion worldwide by 2024. Companies that handle large amounts of data are tapping into deep learning to strengthen areas like machine perception, big data analytics, and the Internet of Things.

In the academic world outside of computer science, though, from physics to public policy, deep learning is being rapidly adopted and could be hugely beneficial, yet it is often used in a way that leaves performance on the table.

Where academia falls short

Getting the most out of machine learning or deep learning frameworks requires optimizing the configuration parameters, often called hyperparameters, that govern these systems. These are the tunable parameters that must be set before any learning actually takes place. Finding the right configuration can yield orders-of-magnitude improvements in accuracy, performance, or efficiency. Yet the majority of professors and students who use deep learning outside of computer science, where these techniques are developed, rely on one of three traditional, suboptimal tuning methods: manual search, trying to optimize high-dimensional problems by hand or intuition via trial and error; grid search, building an exhaustive set of possible parameter combinations and testing each one individually at great cost; or randomized search, the most effective of the three in practice, but unfortunately the equivalent of trying to climb a mountain by jumping out of an airplane and hoping you land on the peak.
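As a rough illustration of two of these baselines (manual search does not lend itself to code), here is a short Python sketch of grid search and randomized search on a hypothetical two-parameter tuning problem; the score function is a made-up stand-in for training a model and measuring validation accuracy.

```python
# Sketch of the two search strategies described above, applied to a toy
# two-parameter tuning problem. score() stands in for training a model and
# returning validation accuracy; in practice each call is expensive.
import itertools
import random

def score(learning_rate, dropout):
    # Hypothetical response surface with a peak near (0.01, 0.3).
    return -((learning_rate - 0.01) ** 2) * 1e4 - (dropout - 0.3) ** 2

# Grid search: exhaustively test every combination on a fixed grid.
lr_grid = [0.001, 0.005, 0.01, 0.05, 0.1]
dropout_grid = [0.1, 0.3, 0.5]
best_grid = max(itertools.product(lr_grid, dropout_grid), key=lambda p: score(*p))

# Randomized search: spend the same budget on randomly sampled points.
random.seed(0)
samples = [(10 ** random.uniform(-3, -1), random.uniform(0.0, 0.7)) for _ in range(15)]
best_random = max(samples, key=lambda p: score(*p))

print("grid search best:  ", best_grid)
print("random search best:", best_random)
```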


While these methods are easy to implement, they often fall short of the best possible solution and waste computational resources that are already scarce in academic settings. Experts often do not apply more advanced techniques because they are orthogonal to their core research, and finding, administering, and tuning more sophisticated optimization methods eats into expert time. This challenge can also push experts toward less powerful but easier-to-tune methods, or stop them from attempting deep learning at all. Researchers have used these traditional methods for years, but they are not always the most effective way to conduct research.

The need for Bayesian Optimization

Bayesian optimization automatically fine-tunes the parameters of these algorithms and machine learning models without accessing the underlying data or the model itself. The process probes the underlying system and observes the outputs, using how previous configurations performed to decide the most promising configuration to try next. This helps researchers and domain experts arrive at the best possible model and frees up time to focus on more pressing parts of their research.
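As an illustrative sketch of that loop in open-source form (not SigOpt’s hosted service), the scikit-optimize library’s gp_minimize fits a Gaussian-process surrogate to past observations and proposes the next configuration to try; the objective below is a hypothetical stand-in for training and evaluating a model.

```python
# Sketch of Bayesian optimization over the same kind of tuning problem,
# using the open-source scikit-optimize library (pip install scikit-optimize).
# The objective stands in for "train the model with these parameters and
# return a validation loss"; the optimizer fits a surrogate model to past
# observations and picks the most promising configuration to evaluate next.
from skopt import gp_minimize
from skopt.space import Real

def objective(params):
    learning_rate, dropout = params
    # Hypothetical validation loss; a real objective would train and evaluate a model.
    return (learning_rate - 0.01) ** 2 * 1e4 + (dropout - 0.3) ** 2

search_space = [
    Real(1e-3, 1e-1, prior="log-uniform", name="learning_rate"),
    Real(0.0, 0.7, name="dropout"),
]

result = gp_minimize(objective, search_space, n_calls=30, random_state=0)
print("best parameters:", result.x)
print("best loss:      ", result.fun)
```

With the same budget of evaluations as the random search sketched earlier, the surrogate model concentrates later trials around the promising region instead of sampling blindly.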

Bayesian optimization has already been applied outside of deep learning to other problems in academia, from gravitational lensing to polymer synthesis to materials design and beyond. Additionally, a number of professors and students are already using this method at universities such as MIT, the University of Waterloo, and Carnegie Mellon to optimize their deep learning models and conduct life-changing research. George Chen, assistant professor at Carnegie Mellon’s Heinz College of Public Policy and Information Systems, uses Bayesian optimization to fine-tune the machine learning models in his experiments. His research applies medical imaging analysis to automate locating a specific organ in the human body, which could help prevent unnecessary procedures in patients with congenital heart defects and others. Before applying Bayesian optimization to his research, Chen had to guess and check the best parameters for his models. Now he can automate the process and receive updates on his mobile phone, freeing time for other necessary parts of the research process.

Unfortunately, the vast majority of researchers leveraging deep learning outside of computer science are not using these powerful techniques. This costs them time and resources, or even completely prevents them from achieving their research goals via deep learning. When these experts are forced to tune multidimensional problems by guess-and-check, they typically spend valuable computational resources on modeling and settle for sub-optimal results. Deploying Bayesian optimization can accelerate the research process, free up time to focus on other important tasks, and unlock better outcomes.

Scott Clark is the co-founder and CEO of SigOpt, which provides its services for free to academics around the world. He has been applying optimal learning techniques in industry and academia for years, from bioinformatics to production advertising systems. Before SigOpt, Scott worked on the Ad Targeting team at Yelp, leading the charge on academic research and outreach with projects like the Yelp Dataset Challenge and open sourcing MOE. Scott holds a PhD in Applied Mathematics and an MS in Computer Science from Cornell University, and BS degrees in Mathematics, Physics, and Computational Physics from Oregon State University. Scott was chosen as one of Forbes’ 30 under 30 in 2016.

Related Items:

Getting Hyped for Deep Learning Configs

Dealing with Deep Learning’s Big Black Box Problem

Machine Learning, Deep Learning, and AI: What’s the Difference?

The post What’s Keeping Deep Learning In Academia From Reaching Its Full Potential? appeared first on Datanami.

Read more here: www.datanami.com/feed/