IPv6 Sky

Ditch Linux for Windows 10 on your Raspberry Pi with Microsoft’s IoT kit

While those interested in running Microsoft’s Windows 10 IoT Core—its free OS for hobbyist boards like the Raspberry Pi 2 and MinnowBoard Max—will likely have the chops to put together their own custom hardware configuration, the company wants to give newbies a helping hand. Microsoft has partnered with Adafruit to release the Windows IoT Core Starter Kit, which gives users everything they need to get started with IoT development.

The $75 (~£50) kit comes complete with an SD card preloaded with Windows 10 IoT Core, a Raspberry Pi 2 case, a full-size breadboard, a miniature WiFi module, a BMP280 environmental sensor, an RGB colour sensor, an eight-channel 10-bit ADC with SPI interface, and a whole host of different resistors and LEDs. Those who also need a Raspberry Pi 2 can pick up a $114.95 (~£70) version with one included. A full list of the included components is below.

  • 8GB Class 10 SD/MicroSD Memory Card w/ Windows 10 IoT Core
  • Adafruit Raspberry Pi B+ Case
  • Full Size Breadboard
  • Premium Male/Male Jumper Wires
  • Premium Female/Male ‘Extension’ Jumper Wires
  • Miniature WiFi Module
  • 5V 2A Switching Power Supply
  • Assembled Adafruit BMP280 Temperature & Pressure Sensor
  • Assembled TCS34725 RGB Color Sensor
  • MCP3008 – 8 Channel 10-Bit ADC With SPI Interface
  • 1x Photo Cell
  • 2x Breadboard Trim Potentiometer
  • 5x 10K 5% 1/4W Resistor
  • 5x 560 ohm 5% 1/4W Resistor
  • 1x Diffused 10mm Blue LED
  • 1x Electrolytic Capacitor – 1.0uF
  • 1x Diffused 10mm Red LED
  • 1x Diffused 10mm Green LED
  • 3x 12mm Tactile Switches
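The analog parts in the kit (the photocell and the trim potentiometers) are read through the MCP3008, since the Raspberry Pi has no analog inputs of its own. A minimal sketch of the ADC's SPI framing, following the common three-byte exchange from the MCP3008 datasheet; the functions are pure so they can be tried without hardware, and the `spidev` usage shown in the comment is an assumption about how you would wire it up on a Pi.

```python
def mcp3008_tx_frame(channel):
    """Build the 3-byte SPI command for a single-ended read of `channel` (0-7)."""
    assert 0 <= channel <= 7
    # Byte 0 carries the start bit; byte 1 has the single-ended flag and the
    # channel number in its top nibble; byte 2 just clocks out the result.
    return [0x01, (0x08 | channel) << 4, 0x00]

def mcp3008_decode(rx):
    """Extract the 10-bit conversion result from the 3 bytes clocked back."""
    return ((rx[1] & 0x03) << 8) | rx[2]

# With real hardware you would exchange the frames over SPI, e.g. (untested,
# assumes the common `spidev` Python package on Raspberry Pi):
#   import spidev
#   spi = spidev.SpiDev(); spi.open(0, 0); spi.max_speed_hz = 1_350_000
#   raw = mcp3008_decode(spi.xfer2(mcp3008_tx_frame(0)))  # 0..1023
```

A reading of 1023 corresponds to the reference voltage (the Pi's 3.3 V rail in the usual wiring), so scaling raw values to volts is a single multiplication.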

Microsoft is hoping that the kit, along with some free sample code, will encourage users to ditch Linux on their IoT projects in favour of Windows 10. While that’s a big ask, the company has been heavily courting the hobbyist community of late. Earlier this year, Microsoft revealed that it was bringing Windows 10 to the ever-popular Arduino microcontroller boards, starting with the release of two open source libraries that connect Arduinos to Windows 10 devices.


Read more here:: IPv6 News Aggregator

Microsoft, Adafruit Partner on Windows IoT Core Starter Kit

The companies offer a Raspberry Pi 2-based bundle to help enthusiasts and Internet of Things developers get up and running fast.


Windows 10 IoT Core Starter Pack announced, and new IoT build released for Insiders

In August, Microsoft officially released Windows 10 IoT Core, with support for the Raspberry Pi 2 and Intel’s MinnowBoard Max boards.



No IoT ‘Killer App’ for consumers but a giant opportunity for Industry 4.0, Summit hears

When public speaking, start with a bang they say. It’s advice that clearly wasn’t lost on Ben Salama, managing director and Global Connected Operations lead at Accenture Mobility, writes Jeremy Cowan. #M2MSUMMIT15 – DAY 2 REVIEW: Dusseldorf, Germany. September 9, 2015 — Salama doesn’t believe there is a ‘Killer App’ for consumers in the Internet […]

The post No IoT ‘Killer App’ for consumers but a giant opportunity for Industry 4.0, Summit hears appeared first on M2M Now – News and expert opinions on the M2M industry, machine to machine magazine.


Starter Kit now available for Windows 10 IoT Core and the Raspberry Pi 2

Microsoft is partnering with Adafruit to release a new Starter Kit designed to get you going quickly and easily on your path of learning either electronics or Windows 10 IoT (Internet of Things) Core and the Raspberry Pi 2, writes Steve Teixeira on the Building Apps for Windows blog. This kit is available now and includes a compatible set of sensors, electronic parts, wires, and cables that have been verified to work with Windows 10 IoT Core, he says.



Ten Years of Secure DNS at .se! (What We Learned)

By Anne-Marie Eklund Löwinder

Ten years ago today, and with 300,000 domains in the zone file, we introduced DNSSEC at .se. It was the end of a fairly long journey, or at least the first stage. The first Swedish workshop to test the new function according to the specifications from the Internet Engineering Task Force was arranged in 1999. At that time, I was still working in the IT Commission’s Secretariat, and the standard was far from complete as it turned out. Our ambition was to change the world, at least the world that exists on the internet.

(This is a translated blog post. You can find the Swedish version here.)

* * *

Falsified DNS information creates the risk of redirecting traffic to an undesired location, in order to steal information or disrupt transactions. For example, if a user wants to reach a specific website, false DNS information can send the user to a different website with false content, or one that fraudulently entices the user to hand over sensitive information by impersonating a bank, tax authority, social insurance agency, or anything else. The DNS world has long known this, but it took many years to produce an antidote. The answer turned out to be DNS Security Extensions (DNSSEC), which makes it possible to detect manipulated and falsified information from name servers through the use of digital signatures.
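The detect-tampering idea behind those digital signatures can be sketched in miniature. Real DNSSEC publishes public-key RRSIG records computed over canonically ordered RRsets; the toy below substitutes an HMAC for the public-key signature purely to show how a validating resolver catches a modified answer. The zone name, key, and addresses are all illustrative.

```python
import hashlib
import hmac

ZONE_KEY = b"example-zone-signing-key"  # stand-in for a real zone-signing key

def sign_record(name, rtype, rdata):
    """Return a signature over one DNS record (HMAC stands in for an RRSIG)."""
    msg = f"{name}|{rtype}|{rdata}".encode()
    return hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()

def verify_record(name, rtype, rdata, signature):
    """A validating resolver recomputes the signature; any tampering breaks it."""
    return hmac.compare_digest(sign_record(name, rtype, rdata), signature)

sig = sign_record("bank.se.", "A", "192.0.2.10")
print(verify_record("bank.se.", "A", "192.0.2.10", sig))   # genuine answer
print(verify_record("bank.se.", "A", "203.0.113.66", sig)) # spoofed address
```

The crucial difference in real DNSSEC is that verification uses only the zone's public key (fetched and vouched for via the chain of DS records up to the root), so resolvers can validate answers without sharing any secret with the zone.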

Because we were the first to do it, we also had to invent all the wheels on our own. Well aware that we would be seen as role models for the rest of the world, we were very careful to document the process and make the transition in a controlled manner. Starting September 13th, we delivered a signed zone file to one operator each day. The idea behind this gradual process was that we would be able to handle any problems without time pressure. During a short transition period, NIC-SE, as we were called at the time, distributed both an unsigned and a signed version of the .se zone from our distribution points. Everything went through without problems, and on September 16th we were finished with the transition.

The Swedish top-level domain was the first top-level domain in the world to introduce DNSSEC. However, it was not enough that the .se zone could handle DNSSEC. Name servers for underlying domains, and the user’s name server that handles the lookup of names to IP addresses (the so-called resolver), also had to be able to handle the technology. Through the years, we have made many efforts to persuade their operators to do this.

The supervising authority, PTS, has been very supportive of the work and decided early on to test how hard it would be to implement DNSSEC. The tests showed that DNSSEC was generally simple for name server operators to implement. What was missing were automated, standardized tools for key generation and zone signing; these were important both for the use of DNSSEC to take off and to ensure that the increased manual work would not negate the increased security that DNSSEC otherwise brought.

So here we are ten years later and the .se zone has grown to 1,257,830 domains of which 585,088 are signed. Most internet operators in Sweden validate answers signed with DNSSEC. DNSSEC is still the way to go to achieve increased trust in the DNS service and thus the internet. Today there are 895 top-level domains signed with DNSSEC.

Expensive? No.

There is free software for signing. An upgrade of one’s own IT environment must be done sometimes anyway; take the opportunity when it happens and it won’t be so burdensome. There is widely used DNS software that supports validation. It takes only slightly more hardware and does not require much additional care.

Difficult? Nah.

It’s not hard to start signing, but it does require a little order in the IT environment, of course. It is also not difficult to start validating, but it requires more knowledge to understand and debug: with DNSSEC, troubleshooting is more difficult than with traditional DNS.

What have we learned?

That we poked at an anthill: eventually, by every means of persuasion, we got more and more registrars to sign customer domains, revealing weaknesses and shortcomings in the available software along the way, something that benefits everyone.

We learned that the carrot must be fairly juicy to attract registrars, so we introduced the opportunity to receive compensation for every registered domain provided they responded properly to DNS queries.

Our work with DNSSEC has given us experience, both internally and for others who work with DNSSEC: developers, registries, registrars, and internet operators.

Are we done yet?

Signing of DNS with DNSSEC is only the beginning. Something we quickly found was that the signing of the domain name system with DNSSEC created a great distribution channel for other security attributes. One example is the recently accepted standard Domain-based Authentication of Named Entities (DANE).

To blindly trust a large number of Certificate Authorities (CAs), as we do today because they are pre-installed in, for example, browsers, is stupid. A CA that has been compromised, or that is simply malicious, can issue certificates for any domain. We have seen many examples of this over the past 2-3 years. DANE makes it possible for domain administrators to certify the keys used in the domain’s TLS clients and servers by storing them in DNS. It also allows domain holders to specify which CAs are allowed to issue certificates for their domain.
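That certificate-to-domain binding is published as a TLSA record. A hedged sketch of computing the record payload for the common "3 1 1" form (domain-issued certificate, SubjectPublicKeyInfo selector, SHA-256 matching type, per RFC 6698); the key bytes and domain name below are dummies, since extracting a real certificate's DER-encoded SPKI is outside the scope of this sketch.

```python
import hashlib

def tlsa_3_1_1(spki_der: bytes) -> str:
    """Payload of a TLSA '3 1 1' record: hex SHA-256 of the SubjectPublicKeyInfo."""
    return hashlib.sha256(spki_der).hexdigest()

# Dummy bytes keep this self-contained; with a real server you would hash
# the DER-encoded public key from its TLS certificate instead.
dummy_spki = b"\x30\x82\x01\x22dummy-public-key-bytes"
record = f"_443._tcp.example.se. IN TLSA 3 1 1 {tlsa_3_1_1(dummy_spki)}"
print(record)
```

Because the record sits in a DNSSEC-signed zone, a client that validates DNSSEC can check the presented certificate against this hash without trusting any external CA at all, which is exactly the "usage 3" case.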

There is still work to do to convince more registrars, for not all of them have yet signed either their own or their clients’ zones. There also continues to be a need to convince more domain holders, above all those representing important societal functions.

We thought earlier, perhaps naively, that all important functions in society would find that DNSSEC was the way to go to protect their users and customers. We could not have been more wrong. Take Swedish banks: they are not all on track. Most municipalities are signed, but among those that are, not everything has been done right when it comes to DNS operations. 65 of 217 government agencies have signed their domains, also with mixed results.

Patience is a virtue. I have plenty of that commodity. I think that more people will discover the need for DNSSEC. If you want to know more about what is happening in the world, ISOC has much information. Please contact us if there is something you have missed or that you think IIS can contribute to when it comes to further development of a secure internet.

Written by Anne-Marie Eklund Löwinder, Head of Security at .SE




Thomas Cook’s Ex-CEO Harriet Green to Lead New IBM IoT Business Units

IBM has formed two new business units that will apply the company’s Big Data, analytics, and cognitive computing capabilities to the Internet of Things and education markets. The company appointed Harriet Green, former CEO of the Thomas Cook Group, to lead the new units.

The two IBM IoT units are part of the $3 billion investment initiative the company announced in March. The four-year spending program’s goal is to develop solutions that take advantage of cognitive computing, cloud data services, and developer tools geared for specific industries, all meant to help organizations address the Internet of Things.

IBM’s cognitive computing work encompasses artificial intelligence and machine learning as well as machine-human interaction. The company’s flagship group of technologies representing this work is Watson, which first appeared in public as a supercomputer that played Jeopardy against the TV game show’s past winners in 2011 and won.

IBM has been quickly productizing the technology behind Watson in a multitude of ways, from specialized solutions for industries, such as financial services or healthcare, to general-purpose Big Data analytics services delivered via public cloud.

Green was appointed CEO of Thomas Cook, the British travel industry giant, in 2012 but was ousted abruptly late last year, which caused a big drop in the company’s share price. She joined when the company’s shares were at one of their lowest points of the decade and was credited with turning the ailing company around.

As VP and general manager, Green will be responsible for developing the new IBM IoT and Education business units. The company plans to grow her team to more than 2,000 consultants, researchers, and developers, IBM said in a statement.

IBM has been involved in coalitions to encourage interoperability between IoT technologies. Earlier this month, it announced an alliance with the processor company ARM to make its IoT products and services compatible with ARM’s mbed operating system.

Last year, together with AT&T, Cisco, GE, and Intel, IBM formed the Industrial Internet Consortium to define open interoperability standards and common architectures for interconnecting “devices, machines, people, processes, and data.”


TODAY! Watch ION Cape Town Live!


Starting today in about an hour, at 9:00am SAST (UTC+2) our ION Cape Town event will be streaming live out of Cape Town, South Africa. We’ll be sharing the very latest news about IPv6, DNSSEC, DANE, TLS, Best Current Operational Practice (BCOP) efforts, and standards within the IETF.

You can watch the event live. Below is the full ION Cape Town agenda and the presentations will be uploaded here as they come in.

I hope you can join us, either here in Cape Town at the lovely Spier Hotel, or online!

9:00 AM

Opening Remarks

Megan Kruse, Internet Society

9:15 AM

Welcome from the Internet Society South Africa Chapter

Alan Levin

9:30 AM

Welcome from the Internet Society South Africa-Gauteng Chapter

Gabriel Ramokotjo

9:45 AM

Collaborative Security: Routing Resilience Manifesto and MANRS

Andrei Robachevsky, Internet Society

The Routing Resilience Manifesto initiative, underpinned by the “Mutually Agreed Norms for Routing Security (MANRS)” document that includes a set of actionable recommendations, aims to help network operators around the world work together to improve the security and resilience of the global routing system. In this session, we’ll explain the basic principles outlined in MANRS, how to sign up and support the effort, and how to get involved in helping to further increase global routing security.

10:00 AM

Why Implement DNSSEC?

Simon Balthazar, TZNIC

DNSSEC helps prevent attackers from subverting and modifying DNS messages and sending users to wrong (and potentially malicious) sites. So what needs to be done for DNSSEC to be deployed on a large scale? We’ll discuss the business reasons for, and financial implications of, deploying DNSSEC, from staying ahead of the technological curve, to staying ahead of your competition, to keeping your customers satisfied and secure on the Internet. We’ll also examine some of the challenges operators have faced and the opportunities to address those challenges and move deployment forward.

10:30 AM


11:00 AM

Deploying DNSSEC: A Case Study

Mark Elkins, Posix Systems – (South) Africa

This session will explore one organization’s technical solution for deploying DNSSEC support within its country code Top Level Domain (ccTLD). With a goal of making it easier for domain name holders to easily add DNSSEC, we will take a quick look at the DNSSEC implementation strategy, the status/progress of signed domains, and lessons learned and challenges for increasing the percentage of signed domain names.

11:30 AM

DANE: The Future of Transport Layer Security (TLS)

Michuki Mwangi, Internet Society

If you connect to a “secure” server using TLS/SSL (such as a web server, email server or xmpp server), how do you know you are using the correct certificate? With DNSSEC now being deployed, a new protocol has emerged called “DANE” (“DNS-Based Authentication of Named Entities“), which allows you to securely specify exactly which TLS/SSL certificate an application should use to connect to your site. DANE has great potential to make the Internet much more secure by marrying the strong integrity protection of DNSSEC with the confidentiality of SSL/TLS certificates. In this session, we will explain how DANE works and how you can use it to secure your websites, email, XMPP, VoIP, and other web services.

12:00 PM

DANE/DNSSEC/TLS Testing in the Go6lab

Jan Žorž, Internet Society

Jan Zorz set up DNSSEC, DANE, and TLS in his go6lab and then tested the implementations in the top one million Alexa domains. Jan will share his experiences deploying, testing, and evaluating DNSSEC, DANE, and TLS in his own lab and explain the process he used.

12:30 PM


1:30 PM

What’s Happening at the IETF? Internet Standards and How to Get Involved

Andrei Robachevsky, Internet Society

What’s happening at the Internet Engineering Task Force (IETF)? What RFCs and Internet-Drafts are in progress related to IPv6, DNSSEC, Routing Security/Resiliency, and other key topics? We’ll give an overview of the ongoing discussions in several working groups and discuss the outcomes of recent Birds-of-a-Feather (BoF) sessions, and provide a preview of what to expect in future discussions.

1:45 PM

IETF, Operational Experience, and Africa

Michuki Mwangi, Internet Society

The Internet Society is working toward fostering a larger and more engaged network operator community around the IETF and protocol development work. Part of that work was a survey of network operators in 2014 and an Internet-Draft about its results. We’re also interested specifically in bringing more African engineers with operational experience into the IETF, and perhaps even bringing a physical IETF meeting to the continent of Africa within the next few years. We’ll outline some of our recent work and hope to make this an interactive session to learn from the local community how to encourage more IETF participation.

2:15 PM

Best Current Operational Practices – An Update

Jan Žorž, Internet Society

The Internet Engineering Task Force (IETF) standardizes the protocols and services that vendors implement and network operators are supposed to deploy and use. We believe there is an opportunity to better identify, capture, and promote best current operational practices emerging from various regional network operators’ groups. We believe sharing these documents across the globe would benefit the wider Internet community and help more operators deploy new technologies like IPv6 and DNSSEC faster and easier. Deploy360’s Jan Zorz will give an update on this progress, discuss the status of BCOP efforts across the world, and give an overview of some of the documents in the process so far.

2:30 PM

Applying the 4DX Methodology to IPv6 Deployment

Mukom Akong Tamon, AfriNIC

Three years ago on World IPv6 Launch, thousands of ISPs, home networking equipment manufacturers, and web companies around the world united to permanently enable IPv6 on their products and services. With global IPv6 traffic at 5% and growing, Africa’s share of that traffic is negligible. Yet IPv6 deployment is one of those initiatives that everyone agrees is merely important, and so it will forever be at the mercy of other initiatives that are urgent. How do you change that? By using the proven 4DX methodology, which gives the IPv6 deployment project a chance amidst the whirlwind of other urgent initiatives. This should certainly start moving the dial on IPv6 traffic in Africa.

3:00 PM


3:30 PM

IPv6 Success Stories– Network Operators Tell All!

Moderator: Nishal Gorbudhan (Packet Clearing House)

Panelists: Andrew Alston, Liquid Telecom; Graham Beneke; Ben Maddison, Workonline Communications (Pty) Ltd.; Mark Tinka, SeaCom

In this session, we invite network operators to share their IPv6 success stories and lessons learned along the way that can help other managers of networks deploy IPv6. How did they do it? What technical, organizational, and political challenges did they face? Attendees will gain vital insight as network operators lay out the stages for IPv6 implementation—creating the business case for management buy-in, initiating a planning process, flipping the switch, and, finally, gathering measurements and proving success.

4:30 PM

Closing Remarks

Megan Kruse, Internet Society


Retail spend on Internet of Things to reach US$2.5bn by 2020, says Juniper Research report

New data from Juniper Research has revealed that retailers seeking to capitalise on IoT (Internet of Things) technologies will spend an estimated US$2.5 billion in hardware and installation costs. That is nearly a fourfold increase over this year’s estimated $670 million spend. The hardware spend includes Bluetooth Beacons and RFID (radio frequency ID) tags. In […]



Sync’ing the Internet of Things to the pace of the business


By: Rob Bamforth, Principal Analyst, Quocirca
Posted: 2nd September 2015
Copyright Quocirca © 2015

I must be a fan of smart connected things—sitting here with two wrist-wearable devices in a house equipped with thirteen wireless thermostats and an environmental (temperature, humidity, CO2) monitoring system. However, even with all this data collection, an Internet of Things (IoT) poster-child application that works out the lifestyles of those in the household and adapts the heating to suit would be a total WOMBAT (waste of money, brains and time).

Why? Systems engineering—frequency response and the feedback loop.

The house’s heating ‘system’ has much more lag time than the connected IT/IoT technology would expect. Thermal mass, trickle underfloor heating, and ventilation heat-recovery systems make for a steady-state heating system, not one optimised by high-frequency energy-trading algorithms. The monitoring is there for infrequent anomaly detection (and reassurance), not minute-by-minute variation and endless adjustments.

The same concepts can be applied to business systems. Some are indeed high frequency, with tight feedback loops that can, with little or no damping or shock absorption, be both very flexible and highly volatile. For example, the Eurofighter Typhoon aircraft, with its inherent instability, can only be flown with masses of data being collected, analysed, and fed back in real time to make the pin-point corrections that keep it under control. Another example is the vast connected banking and financial sector, where there is feedback but, with no over-arching central control, the systems occasionally either do not respond quickly enough or go into a kind of destructive volatile resonance.

Most business systems are not this highly strung. However, there is still a frequency response, or measure of the outputs in response to inputs that characterise the dynamics of the ‘system’, i.e. the business processes. Getting to grips with this is key to understanding the impact of change or what happens when things go wrong. This means processes need to be well understood—measured and benchmarked.

In the ‘old days’, we might have called these “time and motion” studies; progress chasers with stopwatches and clipboards measuring the minutiae of activities of those working on a given task. A problem was that workers (often rightly) thought they were being individually slighted for any out of the ordinary changes or inefficiency in the process, when in reality other (unmeasured) things were often at fault. This approach did not necessarily measure the things that mattered, only things that were easy to measure—a constant failing of many benchmarking systems, even today.

Fast-forward to the 1990s and a similar approach tried to implement improvements through major upheavals under a pragmatic guise—business process re-engineering (BPR). A good idea in principle, especially in bringing a closer relationship between resources such as IT and the business process, but unfortunately many organisations ditched the engineering principles and took a more simplistic route, using BPR as a pretext to reduce staff numbers. BPR became synonymous with ‘downsizing’.

Through the IoT there is now an opportunity to pick up on some of the important BPR principles, especially those with respect to measurement, having suitable resources to support the process and monitoring for on-going continuous improvement (or unanticipated failures). With a more holistic approach to monitoring, organisations can properly understand the behaviour and frequency response of a system or process by capturing a large and varied number of measurements in real time, and then be able to analyse all the data and take steps to make improvements.

Which brings us to the feedback loop. The mistake technologists often make is assuming that, since automating part of a process appears to make things a little more efficient, fully automating it must make it completely efficient.

While automating and streamlining can help improve efficiency, they can also introduce risks if the automation is out of step with the behaviour of the system and its frequency response. This leads to wasting money on systems that do not have the ability to respond quickly or alternatively, destructive (resonant) behaviour in those that respond too fast.
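The two failure modes can be illustrated with a toy discrete control loop: a sluggish "thermal" system nudged toward a setpoint by a proportional controller each tick. A modest gain converges; a gain mismatched to the system's lag overcorrects every tick and the oscillation grows without bound. All constants are illustrative, not drawn from any real heating system.

```python
def run_loop(gain, steps=50, target=20.0, start=15.0, lag=0.5):
    """Simulate a sluggish system corrected by a proportional controller.

    `lag` < 1 means only part of each correction takes effect per tick
    (standing in for thermal mass); `gain` is the controller's aggressiveness.
    """
    temp, history = start, []
    for _ in range(steps):
        correction = gain * (target - temp)
        temp += lag * correction
        history.append(temp)
    return history

calm = run_loop(gain=0.5)    # error shrinks by 25% per tick: settles near 20.0
twitchy = run_loop(gain=5.0) # each tick overshoots by 150%: resonant blow-up

print(round(calm[-1], 2))
print(abs(twitchy[-1] - 20.0) > abs(twitchy[0] - 20.0))
```

Per tick the error is multiplied by (1 - lag * gain), so the loop is stable only while that factor stays inside (-1, 1); the second run's factor of -1.5 is precisely the "automation out of step with the system's frequency response" case described above.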

It might seem cool and sexy to go after a futuristic strategy of fully automated systems, but the IoT offers many practical, tactical benefits by holding a digital mirror up to the real world. A good first step that many organisations would benefit from would be to use it for benchmarking, analysis, and incremental improvements.

First published in www.computerweekly.com
