Big Data’s Relentless Pace Exposes Old Tensions and New Risks in the Enterprise

By Alex Woodie

Over the past two weeks, we’ve explored some of the difficulties that enterprises have experienced in trying to adopt the Hadoop stack of big data technologies. One area that demands further attention is how the rapid pace of development of open source data science technology, and the new business opportunities it unlocks, is simultaneously exposing old fault lines between business and IT and opening enterprises to new risks.

Events like Cloudera and O’Reilly’s recent Strata + Hadoop World conference and Hortonworks’ upcoming DataWorks Summit 2017 are showcases for the burgeoning market for big data technology. While Hadoop itself may not be the center of gravity that it once was, there is no doubt that we’re in the midst of a booming marketplace for distributed computing technologies and data science techniques, and it’s not going to let up anytime soon.

The rapid pace of technological evolution has plusses and minuses. On the plus side, users are getting new technologies to play with all the time. Apache Spark has captured people’s imaginations, but already a replacement is on the horizon for those who think Spark is too slow. Enter Ray, a new technology that RISELab director Michael Jordan discussed during a keynote at last week’s Strata (and which we’ll cover here at Datanami).

Data scientists and developers are having a veritable field day with new software. Meanwhile, new hardware innovations from Intel, IBM, Nvidia, and ARM promise to unleash another round of disruptive innovation just in time for the IoT revolution.

This is a great time to be a data scientist or a big data developer. Like kids in a candy store with $100 to spend — and no parents to tell them what to do — it’s a technological dream come true in many respects.

Too Much, Too Fast?

And therein lies the rub: the kid in the candy store with eyes as big as dinner plates will invariably have a stomach ache of similar proportion.

“We’ve never seen technology change so rapidly,” says Bill Schmarzo, the chief technology officer of the big data practice at Dell EMC and the Dean of Big Data. “I don’t think we know what we’re doing with it yet.”

CIOs are struggling to keep up with the pace of change while retaining the order and organizational structure that their bosses demand, Schmarzo says. “They’ve got the hardest job in the world because the world around them has changed so dramatically from what they were used to,” he says. “Only the most agile and the most business-centric companies are the ones who are going to survive.”

How exactly we got to this point in business technology will be fodder for history books. Suffice it to say, the key driver today is the open source development method, which allows visionaries like Doug Cutting, Jay Kreps, Matei Zaharia and others to share their creations en masse, creating a ripple effect of faster and faster innovation cycles.

As you ogle this technological bounty that seemingly came out of nowhere, keep this key point in mind: All this awesome new open source big data technology was designed by developers for other developers to use.

This is perhaps the main reason why regular companies — the ones in non-tech fields like manufacturing and distribution and retail that are accustomed to buying their technology as shrink-wrapped products that are fully backed and supported by a vendor — are having so much difficulty using it effectively.

The partnership between business leaders and IT is a rocky one (kentoh/Shutterstock)

So, where are the software vendors? While some are working to create end-to-end applications that mask the complexity, many of the players in big data are hawking tools, such as libraries or frameworks that help developers become more productive. We’re not seeing a mad rush of fully shrink-wrapped products, in large part because software vendors are hesitant to get off the merry-go-round and plant a stake in the ground to make the tech palatable to Average Joe, for fear of being left behind by what’s coming next.

The result is today’s culture of roll-your-own big data tech. Instead of buying big data applications, companies hire data scientists, analysts, and data engineers to stitch together various frameworks and use the open source tools to build one-off big data analytics products that are highly tailored to the needs of the business itself.

This is by far the most popular approach, although there are a few exceptions. We’re seeing Hortonworks building Hadoop bundles to solve specific tasks, like data warehousing, cybersecurity, and IoT, while Cloudera is going upstream and competing with the data science platform vendors with its new Data Science Workbench. But homegrown big data analytics is the norm today.

Don’t Lock Me In

While this open source approach works with enough time and money (and blood, sweat, and tears), it’s generally at odds with traditional IT organizations that value things like stability and predictability and 24/7 tech hotlines.

All this new big data technology sold under the “Hadoop” banner has run headlong into IT’s sensibility and organizational momentum, says Peter Wang, the CTO and co-founder of Continuum Analytics.

“One of the points of open source tools is to provide innovation to avoid vendor lock in, and then part of that innovation is agility,” he tells Datanami. “When new innovation comes out, you consume it. What enterprise IT has tended to do is once it deploys some of these open source things is it locks them down and makes them less agile.”

Some CIOs gravitated toward Hadoop because they didn’t want to go through a six-month data migration for some classic data warehouse, Wang says. “Now they’re finding that the IT teams make them go through the same [six-month] process for their Hadoop data lake,” he says.

That’s the source of some of the Hadoop pain enterprises are feeling. They were essentially expecting to get something for nothing with Hadoop and friends, which can be downloaded and used without paying any licensing fees. Even if they understood that it would require investing in people with the skills to develop data applications using the new class of tools, they vastly underestimated the DevOps costs of building and operating those applications.

There is necessary complexity in big data, says Continuum Analytics CTO and co-founder Peter Wang

In the wider data science world, a central tenet holds that data scientists must be free to seek out and discover new data sources that are of value, and to find new ways to extract additional value from existing sources. But granting even that level of agility is anathema to traditional IT’s approach, Wang says.

“All of data science is about being fast, both with the algorithms as well as new kinds of data sets and being able to explore ideas quickly and get them into production quickly,” Wang explains. “There’s a fundamental tension there.”

This tension surprised enterprises looking to adopt Hadoop, which, in its raw Apache form, is largely unworkable for companies that just want to use the product rather than hire a team of developers to learn how to use it. Over the past few years, the Hadoop distributors have worked out the major kinks, filled in the functionality gaps, and arrived at something resembling a working platform. It wasn’t easy (don’t forget the battles fought over Hortonworks’ attempts to standardize the stack with its Open Data Platform Initiative), but today you can buy a functioning stack.

The problem is, just as Hadoop started to harden, the market shifted, and new technology emerged that wasn’t tied to Hadoop (although much of it was shipped in Hadoop distributions). Companies today are hearing about things like deep learning and wondering whether they should be using Google’s TensorFlow, which has no dependency on Hadoop, although an organization may still use Hadoop to store the huge amounts of training data needed for the neural networks its data scientists will build with TensorFlow.

Necessary Vs. Unnecessary Complexity

The complexity of big data tech will only increase, Wang says. And while software vendors may eventually deliver shrink-wrapped products that take the developer-grade complexity out of using this technology, any company that wants to take advantage of the current data science movement will need to steel itself, accept the daunting level of complexity, and make the most of it.

“People are going to have to hire very talented individuals who can draw from this giant pile of parts and build extremely vertically integrated, targeted apps or cloud services or whatever, and have to own, soup-to-nuts, the whole thing,” Wang says. “Before you could rely on Red Hat or Microsoft to provide you an operating system. You could get a database from some vendor or get a Java runtime and Java tooling from somebody else.

Complexity in big data can cause project failure, but it can also lead to technological flexibility (Sergey Nivens/Shutterstock)

“At the end of the day,” Wang says, “you now have six or seven layers of an enterprise software development stack, and then you hire some software developers to sprinkle some magic design pattern stuff and write some things, and you’ve got an app.”

Not all complexity is evil, according to Wang, who differentiates between necessary complexity and unnecessary complexity.

“There’s a central opportunity available in this space right now, and that essential opportunity is ultimately the oxygen that’s driving all these different kinds of innovation,” Wang says. “The insight that’s available with the data we have – that is the oxygen causing everything to catch fire.”

We’re experiencing a Gold Rush mentality at the moment with regard to data and the myriad ways organizations can monetize it or otherwise do something productive with it. If you can get past the complexity and get going with the data, you have the potential to shake up an industry and get rich in the process, which is ultimately what’s driving the boom.

“There’s a concept of the unreasonable effectiveness of data, where you just have a [big] ton of data in every category,” Wang says. “You don’t have to be really smart, but if you can get the right data and harness it and do some fairly standard thing with it, you are way ahead of the competition.”
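Wang’s point about the unreasonable effectiveness of data can be illustrated with a toy experiment (this sketch, including the 70% label rate and all names in it, is invented for illustration and is not from the article): a trivial majority-vote predictor becomes reliably accurate once it sees enough data, even though the technique itself requires no cleverness at all.

```python
import random

random.seed(7)

TRUE_RATE = 0.7  # assumed for the sketch: 70% of events carry the "positive" label


def majority_baseline_accuracy(n_train: int, n_test: int = 100_000) -> float:
    """Fit the simplest possible model -- always predict whichever label
    was the majority in training -- and measure held-out accuracy."""
    train = [random.random() < TRUE_RATE for _ in range(n_train)]
    predict = sum(train) * 2 >= n_train  # True if the positive label won the vote
    test = [random.random() < TRUE_RATE for _ in range(n_test)]
    return sum(label == predict for label in test) / n_test


# With a handful of samples the majority vote can land on the wrong side;
# with plenty of data the same "fairly standard thing" converges on the
# best constant predictor (about 70% accuracy in this setup).
print(majority_baseline_accuracy(5))
print(majority_baseline_accuracy(10_000))
```

The model never gets smarter; only the data volume changes, which is the heart of Wang’s observation.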

Hedging Tech Dynamism

There is a lot of uncertainty around what technologies will emerge and become popular, and companies don’t want to make bad bets on losing tech. One must have the stomach to accept relentless technological change, which Hadoop creator Doug Cutting likened to Darwinian evolution through random digital mutations.

One hedge against technology irrelevancy is flexibility, and that’s generally what open source provides, Schmarzo says.

“We think we have the right architecture, but we really don’t know what will change,” he says. “So how do I give myself an architecture that gives me as much agility and flexibility as possible, so when things change I haven’t locked myself in?”

Adopting an open source platform gives you, theoretically, the most flexible environment, he says, even if it runs counter to the prevailing desire in organizations to rely on outside vendors for technology needs. Investing in open source also makes you more attractive to prospective data scientists, who are eager to use the latest and greatest tools.

The tsunami of data and relentless pace of technological evolution threatens to leave tech executives all wet (Couperfield/Shutterstock)

“Our approach so far has been, on the data science side, to let them use every tool they want to do their exploration and discovery work,” Schmarzo says. “So if they come out of university with experience in R or Python, we let them use that.”

Organizations may want the best of all worlds, but they will be forced to make tradeoffs at some point. “There is no silver bullet. Everything’s a trade off in life,” Schmarzo says. “You’ve got to build on something. You’ve got to pick something.”

The key is to retain that flexibility as much as possible so you’re able to adapt to the new opportunities that data provides. The fact that open source is both the source of the flexibility and the source of the complexity is something that technology leaders will simply have to deal with.

“The IT guys want everything locked down. Meanwhile the business opportunity is passing you by,” he adds. “I would hate to be a CIO today. It was easy when you had to buy SAP and Oracle [ERP systems]. You bought them and it took you 10 years to put the stupid things in but it didn’t matter because it’s going to last 20 years. Now we’re worried if it doesn’t go in in a couple of months because in two months, it may be obsolete.”

While there’s a risk in betting on the wrong big data technology, getting flummoxed by Hadoop, or making poor hiring decisions, the cost of not even trying is potentially even bigger.

“Enterprises really need to understand the business risks around that,” Wang says. “I think most of them are not cognizant yet of what that means. You’re going to tell your data scientists ‘No you can’t look at those five data sets together, just because.’ Because the CIO or the CDO making that decision or that call does not recognize the upside for them. There’s only risk.”

Related Items:

Hadoop Has Failed Us, Tech Experts Say

Hadoop at Strata: Not Exactly ‘Failure,’ But It Is Complicated

Anatomy of a Hadoop Project Failure

Cutting On Random Digital Mutations and Peak Hadoop

The post Big Data’s Relentless Pace Exposes Old Tensions and New Risks in the Enterprise appeared first on Datanami.

Read more here:: www.datanami.com/feed/

The post Big Data’s Relentless Pace Exposes Old Tensions and New Risks in the Enterprise appeared on IPv6.net.

Read more here:: IPv6 News Aggregator

How police unmasked suspect accused of sending seizure-inducing tweet

By David Kravets

(credit: zodman)

The man accused of sending a Newsweek writer a seizure-inducing tweet left behind a digital trail that the Dallas Police Department traced—beginning with the @jew_goldstein Twitter handle, leading to a burner mobile phone SIM card, and ending with an Apple iCloud account, according to federal court documents unsealed in the case.

Rivello with driver’s license. (credit: Court documents)

John Rayne Rivello was arrested Friday at his Maryland residence and is believed to be the nation’s first defendant accused of federal cyberstalking charges for allegedly victimizing an epileptic with a strobing, epileptogenic online image—in this instance a GIF sent via Twitter.

According to court documents, when Newsweek writer Kurt Eichenwald of Dallas, Texas, opened his Twitter feed on December 15, he was met with a strobing message that read, “you deserve a seizure for your post.” Eichenwald, who has written that he has epilepsy, went into an eight-minute seizure during which he lost control of his body functions and mental faculties. His wife found him, placed him on the floor, called 911, and took a picture of the offending tweet, according to court records.

Read more here:: feeds.arstechnica.com/arstechnica/index?format=xml

The post How police unmasked suspect accused of sending seizure-inducing tweet appeared on IPv6.net.

Read more here:: IPv6 News Aggregator

Shedding Light on How Much Energy the Internet and ICTs Consume

By Michael Oghia

Ever since I published an essay exploring the relationship between climate change and the Internet, I have endeavored to bring this subject to the fore as often as possible (and in relevant fora and discussions) since the responsibility of creating a more sustainable world falls on all communities and stakeholder groups. It is particularly pressing now — at a time when international interest in curbing climate change is strengthening, while it is juxtaposed with the receding commitments of the United States government vis-à-vis climate change and the environment under the Trump administration, which was reflected in his first official budget proposal.

Instances where I have highlighted this topic include advocating for more environmentally friendly practices, such as reducing energy use and/or transitioning to renewable energy sources like solar and wind, at the global Internet Governance Forum (IGF), which was held in Guadalajara, Mexico, in December 2016. The Dynamic Coalition on the Internet and Climate Change (DCICC), which was a focus of the aforementioned essay, submitted its annual report leading up to the IGF, and was represented at the Dynamic Coalition (DC) main session, where we updated the IGF community about our work and progress made in 2016. I was also able to facilitate two breakout sessions at the Internet Society (ISOC)-sponsored Collaborative Leadership Exchange (CLX) — one where we discussed the Sustainable Development Goals (SDGs), and another that focused solely on the Internet, information and communications technologies (ICTs), and the environment. The work has only just begun, however, and is continuing in earnest. For instance, I was appointed as the focal point for a European Dialogue on Internet Governance (EuroDIG) workshop examining digital pollution and its effects on the environment (such as electronic waste (e-waste) and energy consumption), and I am co-organizing the DCICC annual session at the 2017 WSIS Forum.

So far, most of the feedback I have received from individuals across the Internet governance community about raising this issue has been positive. I greatly appreciate the support that has been shown, and the relevance of maintaining this discussion was further reinforced by a World Health Organization (WHO) publication that was released earlier this month (March) regarding technology, e-waste, and the environment:

“The WHO also noted [in their Inheriting a Sustainable World: Atlas on Children’s Health and the Environment report [PDF]] the importance of properly managing emerging environmental hazards like electronic and electrical waste. Without proper recycling, this can lead to children being exposed to dangerous toxins known to harm intellectual development and cause attention deficits, as well as more serious conditions like lung disease and cancer.”

With the proliferation of the Internet of Things (IoT), the dangers raised by the WHO’s report are even more pressing. Yet e-waste is only one part of the problem. As more and more people come online, more devices will come online as well, further increasing the power consumed by the Internet and ICTs. This point was explicitly raised in a personal email exchange between Vint Cerf — one of the “fathers of the Internet” who co-invented TCP/IP — and me. We were discussing Google’s transition to fully renewable energy use for its data centers, and he posed two questions. After Vint gave me his consent to share the information from our exchange, I decided to publish it here as a follow-up to my October 2016 essay. The following is my answer to his questions (which are listed below in bold). Also, for full disclosure, note that I often refer to Google as a case study because (1) Vint is vice president and chief Internet evangelist at Google, (2) his inquiry regarding Google’s data center efficiency is specifically what prompted the discussion, and (3) Google has been committed for years to reducing its carbon footprint, as well as to sharing that insight with other stakeholders, specifically in the private sector and technical community.

1. “Do you know whether the aggregate power requirements for the data centers exceed the power requirements for all the laptops, desktops, mobiles, tablets, home routers and Wi-Fi units, etc.?”

I do not have this information, but I imagine it is a great deal when multiplied by the billions of devices that exist. I found two articles that list the wattage for various electronics (one from Daft Logic, the other from the American Council for an Energy-Efficient Economy). I am not sure, though, whether those numbers reflect the differing realities (and policy environments) of non-U.S. electronics.

2. “What fraction of the power consumption does the Internet (and its access devices) take?”

I wish there were an easy number to cite, but unfortunately the figures are constantly in flux — they depend on the myriad factors taken into account during analysis, the number of devices, and various optimizations to infrastructure like data centers (e.g., using renewable/green energy, using artificial intelligence (AI) to help increase efficiency, etc.). They also often do not account for global numbers (as doing so would likely be much more difficult). Having said that, I found many sources that help shed light on this question (while also illuminating the first question he posed above):

To begin, the 2008 Global e-Sustainability Initiative (GeSI) SMART2020 report, which examined how to enable the low carbon economy in the information age, indicated: “ICTs currently contribute 2 percent to 3 percent of global greenhouse gas (GHG) emissions.” To put this into perspective, even based on 2008 numbers, “If the Internet were a country, it would rank as the fifth-largest for energy consumption.” Note, however, that the 2015 GeSI Smarter2030 report stressed, “ICT emissions as a percentage of global emissions will decrease over time,” and the GeSI revised the percentage of total global carbon emissions predicted in their 2008 report “due to a range of investments companies in the sector have been making to reduce their emissions and to the expected improvements in the efficiency of ICT devices … [Therefore,] the ICT sector’s emissions ‘footprint’ is expected to decrease to 1.97 percent of global emissions by 2030, compared to 2.3 percent in 2020.”

Bear in mind as well that the numbers are constantly changing in terms of the environmental impact of the Internet. For instance, as reported in The Verge, Google “used some 4,402,836 megawatt-hours (MWh) of electricity in 2014 (equivalent to the amount of energy consumed by 366,903 U.S. households),” but that number is being offset by the amount of renewable energy and other innovations powering its infrastructure as well. Furthermore, according to CCCB Lab:

“The first thing that emerges after surveying various sources is that nobody knows for sure. In 2010, The Guardian came up with the figure of 300 million tons of [carbon dioxide (CO2)] per year, ‘as much as all the coal, oil and gas burned in Turkey or Poland in one year.’ A controversial article titled “Power, Pollution, and the Internet” in The New York Times put the figure at 30 billion watts of electricity in 2011, ‘roughly equivalent to the output of 30 nuclear power plants.’ And according to Gartner consultants, the Internet was responsible for 2 percent of global emissions in 2007, outstripping the carbon footprint of the aviation industry. A more recent study by the Melbourne, Australia-based Centre for Energy-Efficient Telecommunications (CEET) estimated in 2013 that the telecommunications industry as a whole emits 830 million tons of carbon dioxide a year — [accounting for 1.5 percent to 2 percent of the world’s energy consumption] — and that the energy demands of the internet could double by 2020. Jon Koomey — [a research fellow at Stanford University’s Steyer-Taylor Center for Energy Policy who has been studying Internet energy effects since 2000 and identified a long-term trend in energy-efficiency of computing that has come to be known as Koomey’s Law] — estimates that the direct electricity use of all the elements that make up the Internet is probably around 10 percent of total electricity consumption, but he emphasizes that it is very difficult to calculate exact figures: ‘You can use a computer to play video games or write a text and not be online, and this energy use is often counted as part of the Internet even though it isn’t actually the case.'”

Additionally, a 2015 article published in The Atlantic offered the following data:

“According to the U.S. Energy Information Administration, in 2012 global electricity consumption was 19,710 billion kilowatt-hours (kWh). Using Google’s estimate [of its data center’s energy use] and electricity-consumption data from the CIA World Factbook, they’re using about as much electricity annually as the entire country of Turkey. (Honestly, that number seems impossibly high considering that in 2011 Google disclosed that it used merely 260 million watts of power, at the time noted for being slightly more than the entire electricity consumption of Salt Lake City.) In its 2013 sustainability report, Facebook stated its data centers used 986 million kWh of electricity — around the same amount consumed by Burkina Faso in 2012 … The impact of data centers — really, of computation in general — isn’t something that really galvanizes the public, partly because that impact typically happens at a remove from everyday life. The average amount of power to charge a phone or a laptop is negligible, but the amount of power required to stream a video or use an app on either device invokes services from data centers distributed across the globe, each of which uses energy to perform various processes that travel through the network to the device. One study … estimated that a smartphone streaming an hour of video on a weekly basis uses more power annually than a new refrigerator” [emphasis mine].

Another perspective to consider is how growth affects the numbers. For example, after interviewing Dr. Mike Hazas, one of the researchers from Lancaster University’s School of Computing and Communications involved in a study that warned how “the rapid growth of remote digital sensors and devices connected to the Internet [and the IoT] has the potential to bring unprecedented and, in principle, almost unlimited rises in energy consumed by smart technologies,” the writer of this article shared the following data:

“The increase in data use has brought with it an associated rise in energy use, despite improvements in energy efficiencies. Current estimates suggest the Internet accounts for 5 percent of global electricity use but is growing faster, at 7 percent a year, than total global energy consumption at 3 percent. Some predictions claim information technologies could account for as much as 20 percent of total energy use by 2030.”
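As a rough sanity check of my own (not from the study), compounding the quoted growth rates shows how far steady growth alone would carry the Internet's share of global electricity use:

```python
# A back-of-the-envelope sketch: project the Internet's share of global
# electricity use, assuming the quoted rates hold steady.
# Assumptions (from the figures above): a 5% share today, Internet energy use
# growing 7%/year, total consumption growing 3%/year, ~13 years to 2030.

def projected_share(initial_share, internet_growth, total_growth, years):
    """Share of total electricity after `years` of differential growth."""
    return initial_share * ((1 + internet_growth) / (1 + total_growth)) ** years

share_2030 = projected_share(0.05, 0.07, 0.03, 13)
print(f"Projected share by ~2030: {share_2030:.1%}")  # prints "8.2%"
```

Simple compounding of these rates yields only around 8 percent by 2030, well short of the 20 percent some predictions claim, which suggests those forecasts assume sharply accelerating demand (from the IoT, for instance) rather than a continuation of current growth.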

Conversely, in 2013, The Register reported: “The information and technology ecosystem now represents around 10 percent of the world’s electricity generation.” It based this data on an August 2013 report written by Digital Power Group (DPG) CEO Mark P. Mills titled The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, and Big Power (disclaimer: it was sponsored by the American Coal Association, a pro-coal lobbying group). He wrote:

“Based on a mid-range estimate, the world’s [ICT] ecosystem uses about 1,500 terawatt-hours (TWh) of electricity annually, equal to all the electric generation of Japan and Germany combined — as much electricity as was used for global illumination in 1985. The ICT ecosystem now approaches 10 percent of world electricity generation. Or in other energy terms — the zettabyte era already uses about 50 percent more energy than global aviation … Hourly Internet traffic will soon exceed the annual traffic of the year 2000. And demand for data and bandwidth and the associated infrastructure are growing rapidly not just to enable new consumer products and video, but also to drive revolutions in everything from healthcare to cars, and from factories to farms. Historically, demand for bits has grown faster than the energy efficiency of using them. In order for worldwide ICT electric demand to merely double in a decade, unprecedented improvements in efficiency will be needed now” [emphasis theirs].
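Mills' mid-range figure can be loosely cross-checked against the EIA number quoted earlier (my own arithmetic, not from either report):

```python
# Cross-check: Mills' mid-range ICT estimate against the EIA's 2012 figure
# for global electricity consumption quoted earlier in this article.
ict_twh = 1_500      # Mills' mid-range ICT ecosystem estimate, TWh/year
world_twh = 19_710   # EIA 2012 global electricity consumption, TWh

share = ict_twh / world_twh
print(f"ICT share of global electricity: {share:.1%}")  # prints "7.6%"
```

That works out to roughly 7.6 percent, broadly consistent with "approaches 10 percent," though generation and consumption totals are not strictly comparable, so the check is only approximate.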

The Register’s report also emphasized the following about power consumption regarding personal devices: “Reduced to personal terms, although charging up a single tablet or smartphone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year. And as the world continues to electrify, migrating towards one refrigerator per household, it also evolves towards several smartphones and equivalent per person” [emphasis theirs]. (A methodology note from The Register: “This example used publicly available data on the average power utilization of a telecom network, the cost of wireless network infrastructure, and the energy that goes into making a tablet, although it ignored the data centers the video is served out of, and tablet charging” (in other words — though Google has purported that the cost of a Google search is 0.0003 kilowatt-hours (kWh) of energy — the likely cost is higher due to the power cost lurking in the non-Google systems used to deliver the data and perform the search). “[Furthermore,] the report’s figure reflects not just the cost of data centers — according to a 2007 report by the Environmental Protection Agency (EPA), U.S. data centers consumed 1.5 percent of U.S. electricity production, which was projected to rise to 3 percent by 2011 — but also the power involved in fabbing chips and the power consumption of digital devices and the networks they hang off).”

It is important to highlight, however, that regarding the stated estimate that direct electricity use of the Internet is probably around 10 percent of total electricity consumption, Koomey said the same thing during his keynote address at Google’s How Green is the Internet Summit in June 2013, but he immediately added that “the number does not tell us very much” (source). His words were further reinforced by the slides he presented at the event. On slide 7, he shared a graph, based on data from a 2013 study using information collected for Sweden circa 2010, showing annual electricity use (GWh/year) across various technological devices. It showed that user PCs accounted for approximately 1,800 GWh/year, compared to the second-most energy-consuming category, data centers and third-party local-area networks (LANs), which was responsible for close to 1,300 GWh/year. Other user equipment accounted for around 700 GWh/year, while the lowest-ranked technology, the Internet Protocol (IP) core network, was responsible for around 250 GWh/year. Whether this trend has been sustained since 2010, though, is unclear. (The Google event itself was bolstered by a blog post written that same month, which corresponded with the release of a report by the Lawrence Berkeley National Laboratory (Berkeley Lab) titled The Energy Efficiency Potential of Cloud-based Software: A U.S. Case Study. It showed that “migrating all U.S. office workers to the cloud could save up to 87 percent of information technology (IT) energy use — about 23 billion kilowatt-hours (kWh) of electricity annually, or enough to power the city of Los Angeles for a year.” Berkeley Lab also made its model publicly available “so other researchers and experts can plug in their own assumptions and help refine and improve the results.” Bear in mind that, ultimately, the goal in this case was not to emphasize the effects of personal electronics, but energy efficiency and management overall of larger technical infrastructure.)
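To see why Koomey cautioned that the headline number "does not tell us very much," it helps to express the slide's Swedish figures as shares of the total (my own arithmetic on the quoted values):

```python
# The Swedish (c. 2010) breakdown from Koomey's slide 7, expressed as shares.
# Values are the approximate GWh/year figures quoted above.
usage = {
    "user PCs": 1800,
    "data centers + third-party LANs": 1300,
    "other user equipment": 700,
    "IP core network": 250,
}
total = sum(usage.values())  # 4,050 GWh/year
for device, gwh in usage.items():
    print(f"{device}: {gwh / total:.0%}")
```

On these figures, end-user equipment accounts for well over half of the total, while the IP core network, the part most people picture as "the Internet," is only around 6 percent.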

There is also information available from a 2013 Time article that directly addresses some of the specifics regarding Vint’s second question and criticizes Mills’ study:

“It’s important to note that the amount of energy used by any smartphone will vary widely depending on how much wireless data the device is using, as well as the amount of power consumed in making those wireless connections — estimates for which vary. The above examples assume a relatively heavy use of 1.58 GB a month — a figure taken from a survey of Verizon iPhone users last year. That accounts for the high-end estimate of the total power the phone would be consuming over the course of a year. NPD Connected Intelligence, by contrast, estimates that the average smartphone is using about 1 gigabyte (GB) of cellular data a month, and in the same survey that reported high data use from Verizon iPhone users, T-Mobile iPhone users reported just 0.19 GB of data use a month — though that’s much lower than any other service. Beyond the amount of wireless data being streamed, total energy consumption also depends on estimates of how much energy is consumed per GB of data. The top example assumes that every GB burns through 19 kilowatt-hours (kWh) of electricity. That would be close to a worst-case model. The CEET assumes a much lower estimate of 2 kWh per GB of wireless data, which would lead to a much lower electricity consumption estimate as well — as little as 4.6 kWh a year with the low T-Mobile data use. In the original version of the post, I should have noted that there is a significant range in estimates of power use by wireless networks, and that this study goes with the very high end.”
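The spread in these assumptions is easier to appreciate when multiplied out (my own arithmetic on the figures quoted above):

```python
# How widely annual smartphone network-energy estimates swing depending on
# the assumed data use and the assumed energy-per-GB figure (both quoted above).

def annual_network_kwh(gb_per_month, kwh_per_gb):
    """Annual network energy attributed to one phone's data traffic."""
    return gb_per_month * 12 * kwh_per_gb

high = annual_network_kwh(1.58, 19)  # heavy Verizon user, worst-case per-GB figure
low = annual_network_kwh(0.19, 2)    # light T-Mobile user, CEET's lower figure
print(f"high: {high:.0f} kWh/yr, low: {low:.1f} kWh/yr")  # high: 360, low: 4.6
```

The high and low cases differ by a factor of nearly 80, which goes a long way toward explaining why published estimates of smartphone energy use disagree so sharply.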

A note on the calculations on smartphone energy use: this comes from an email by Max Luke, a policy associate at the Breakthrough Institute, which posted about Mills’ study. He wrote:

“Last year [in 2012], the average iPhone customer used 1.58 GB of data a month, which times 12 is 19 GB per year. The most recent data put out by ATKearney for the mobile industry association GSMA (p. 69) says that each GB requires 19 kWh. That means the average iPhone uses (19 kWh × 19 GB) 361 kWh of electricity per year. In addition, ATKearney calculates each connection at 23.4 kWh. That brings the total to 384.4 kWh. The electricity used annually to charge the iPhone is 3.5 kWh, raising the total to 388 kWh per year. The EPA’s Energy Star shows refrigerators with efficiency as low as 322 kWh annually.”
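Luke's arithmetic checks out on its own terms, as a few lines of Python confirm; what the critics dispute is the per-GB network figure itself, not the multiplication:

```python
# Reproducing Max Luke's arithmetic as quoted, versus the refrigerator benchmark.
gb_per_year = 1.58 * 12            # ~19 GB of cellular data per year
network = round(gb_per_year) * 19  # energy per GB of mobile data (disputed figure)
connection = 23.4                  # per-connection overhead, kWh/year
charging = 3.5                     # charging the phone itself, kWh/year

total = network + connection + charging
print(f"iPhone total: ~{total:.0f} kWh/yr vs. 322 kWh/yr refrigerator")  # ~388
```

Because the network term dominates (361 of the 388 kWh), shrinking the per-GB assumption by even one order of magnitude collapses the refrigerator comparison entirely.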

The Time article continued: “Breakthrough ran the numbers on the iPhone specifically — Mills’ endnotes (see page 44 in the report) refer to smartphones and tablets more generally — but Luke notes that Mills confirmed the calculations. These estimates are at the very high end — other researchers have argued that power use by smartphones is much lower. And the Mills study itself has come in for strong criticism from other experts.”

As this Forbes article noted:

“[Koomey said] he ‘spent years debunking’ Mills’ claims and published a paper in 2000 that directly contradicted his findings. Koomey [added] he was shocked to see Mills ‘rehashing’ his ideas now. ‘If he is making this claim again, that would be just crazy, outrageous,’ Koomey said. ‘What we found in 2000 is that a refrigerator used 2,000 times more electricity than the networking electricity of a wireless Palm Pilot. He is not a credible source of information.’ [Moreover,] Gernot Heiser, a professor at the University of New South Wales in Sydney and co-author of a 2010 study on power consumption in smartphones, echoed Koomey’s sentiments [that Mills’ work was flawed]. Heiser said Mills’ work ‘seems blatantly wrong.’ He said Mills overestimates the amount of power used by a modern smartphone, in this case a Galaxy S III, by more than four times. ‘I’d have to have a quick look to see how they arrive at this figure, but it certainly looks like baloney to me,’ Heiser said.”

Quoting from the Time article, “Gang Zhou, an associate professor of computer science at the College of William and Mary, was less direct in attacking Mills’ claims, but nonetheless said his measurements for the power consumption of smartphones were at least ‘one or two magnitude’ higher than they should be. Nonetheless, Zhou added that the subject of data center electricity usage is an important issue and it ‘should raise concern.'”

Koomey also reinforced the aforementioned criticism. In a 2013 article titled “Jonathan Koomey: Stop worrying about IT power consumption,” the author of the article wrote:

“By 2010, for example, data centers accounted for approximately 1.3 percent of worldwide electricity use and 2 percent of U.S. electricity use, according to Koomey’s August 2011 paper, “Growth in Data Center Electricity Use, 2005 to 2010.” This amount is growing, certainly, but at a far slower rate than we previously imagined. Still, that article helped inspire an industry-wide interest in the nexus of technology and energy efficiency that might otherwise have taken years to develop. ‘It was the process of debunking those claims that led me to spend a lot more time on data center electricity use and also on the electricity use of all sorts of computing devices,’ Koomey recalled. As he dug into the numbers, he actually discovered that efficiency has been improving since the days of vacuum tubes, a thesis he explored in his ‘One Great Idea’ presentation at the 2012 VERGE conference in Washington, D.C. This is one thing making the explosion of mobile devices such as smartphones and tablet computers viable, along with the associated reductions in the power consumption associated with client computing devices. Consider that a desktop computer uses roughly 150 kWh to 200 kWh of electricity annually, compared with 50 to 70 kWh for a notebook PC, 12 kWh for a tablet or 2 kWh for a smartphone. It’s also a very important development for the so-called Internet of Things, the vast network of sensors emerging to support a huge array of applications related to green buildings, intelligent transportation systems and so on. Despite suggestions otherwise, these applications should have very little impact on overall IT power consumption.”

Conclusion

Based on the outdated and often contradictory information available, I would stress that the ultimate answer to Vint’s question is, unfortunately, inconclusive. Even a follow-up question Vint posed about the merits of switching to LED lighting in offsetting the power consumption of ICTs was undermined by a New Republic story that argued (according to the aforementioned Time article):

“The greenest building in New York City [at the time] — the Bank of America Tower, which earned the Leadership in Energy and Environmental Design’s (LEED) highest Platinum rating — was actually one of the city’s biggest energy hogs. Author Sam Roudman argued that all the skyscraper’s environmentally friendly add-ons — the waterless urinals, the daylight dimming controls, the rainwater harvesting — were outweighed by the fact that the building used ‘more energy per square foot than any comparably sized office building in Manhattan,’ consuming more than twice as much energy per square foot as the 80-year-old (though recently renovated) Empire State Building.”

What is not undermined, however, is my rationale for exploring this topic more within the Internet community. While the Internet and ICTs are not the main contributor to climate change (compared to, say, energy production in general), there are a few considerations to keep in mind:

1. The energy needed for infrastructure such as data centers and the energy needed for electronic devices (regardless of size or scope) are essentially two sides of the same coin, but data center/server operators generally have much more centralized control over how such centers/servers are powered than end users do.

2. Private sector data centers are becoming more efficient and are increasingly run by renewable energy, but many Internet exchange points (IXPs), for instance, as well as other critical infrastructure and non-private sector structures (such as government servers) are not. (See, for example, the abovementioned Atlantic article: “But that’s leverage available to companies operating at the scale of Facebook and Google [to galvanize states to cut non-renewable/fossil fuel energy sources]. It’s not really something that smaller colocation services can pull off. Relative to the entire data-center industry — data centers run on university campuses, enterprise colocation providers, hospitals, government agencies, banks — companies like Facebook and Google are a pronounced, but still minor piece of the larger data-center landscape. Some smaller companies have been able to push for changes, but they tend to need one of the heavy-hitter companies to act as muscle first” [emphasis mine]).

3. As more people come online, more and more data will be generated — to the point where the amount of energy needed to power the infrastructure that supports such data could grow exponentially. As Mills’ report stressed:

“Future growth in electricity to power the global ICT ecosystem is anchored in just two variables: demand (how fast traffic grows) and supply (how fast technology efficiency improves). As costs keep plummeting, how fast do another billion people buy smartphones and join wireless broadband networks where they will use 1,000 times more data per person than they do today; how fast do another billion, or more, join the Internet at all; how fast do a trillion machines and devices join the Internet to fuel the information appetite of Big Data? Can engineers invent, and companies deploy, more efficient ICT hardware faster than data traffic grows?”

Addressing each of these points — and what the Internet governance community can do about it — is critical. Given the inconclusive nature of this article, it is better to err on the side of caution — that is, address concerns related to energy and the environment within our domain, especially when investing in infrastructure upgrades. For instance, Koomey argued, “For in-house data centers that are standard business facilities, there is a strong case from both a cost and environmental perspective for going to the cloud.”

This also involves sharing best practices and solutions, working collaboratively to help make current infrastructure more efficient and sustainable, and planning better for the future (which of course includes policy discussions), as well as examining our entire production process and incorporating a more circular economy. Extending this logic to ICTs covers not merely the infrastructure and processes governing the Internet, but also aspects of the information society such as wireless infrastructure (e.g., towers and routers), wired infrastructure (e.g., manufacturing and laying fiber, including underwater cable), the recyclability and sustainability of Internet-connected devices (e.g., manufacturing processes, recycling, and resource acquisition), and where the materials for such devices will come from in order to help the next billion(s) get online.

Written by Michael Oghia, independent #netgov consultant & editor


Shedding Light on How Much Energy the Internet and ICTs Consume

By News Aggregator

By Michael Oghia

Ever since I published an essay exploring the relationship between climate change and the Internet, I have endeavored to bring this subject to the fore as often as possible (and in relevant fora and discussions) since the responsibility of creating a more sustainable world falls on all communities and stakeholder groups. It is particularly pressing now — at a time when international interest in curbing climate change is strengthening, while it is juxtaposed with the receding commitments of the United States government vis-à-vis climate change and the environment under the Trump administration, which was reflected in his first official budget proposal.

Such instances where I have highlighted this topic included advocating for more environmentally friendly practices, such as reducing energy use and/or transitioning to renewable energy sources like solar and wind, at the global Internet Governance Forum (IGF), which was held in Guadalajara, Mexico, in December 2016. The Dynamic Coalition on the Internet and Climate Change (DCICC), which was a focus of the aforementioned essay, submitted its annual report leading up to the IGF, and was represented at the Dynamic Coalition (DC) main session where we updated the IGF community about our work and progress made in 2016. I was able to facilitate two breakout sessions at the Internet Society (ISOC)-sponsored Collaborative Leadership Exchange (CLX) as well — one where we discussed the Sustainable Development Goals (SDGs), and another that focused solely on the Internet, information and communications technologies (ICTs), and the environment. The work has only just begun, however, and is continuing in earnest. For instance, I was appointed as the focal point for a European Dialogue on Internet Governance (EuroDIG) workshop examining digital pollution and the effects on the environment (such as electronic waste (e-waste) and energy consumption), and I am co-organizing the DCICC annual session at the 2017 WSIS Forum.

So far, most of the feedback I have received from individuals across the Internet governance community about raising this issue has been positive. I greatly appreciate the support that has been shown, and the relevance of maintaining this discussion was further reinforced by a World Health Organization (WHO) publication that was released earlier this month (March) regarding technology, e-waste, and the environment:

“The WHO also noted [in their Inheriting a Sustainable World: Atlas on Children’s Health and the Environment report [PDF] the importance of properly managing emerging environmental hazards like electronic and electrical waste. Without proper recycling, this can lead to children being exposed to dangerous toxins known to harm intellectual development and cause attention deficits, as well as more serious conditions like lung disease and cancer.”

With the proliferation of the Internet of Things (IoT), the dangers raised by the WHO’s report are even more pressing. Yet, e-waste is only one part of the problem. As more and more people come online, more devices are going to come online as well, which is going to further add the need for power consumption by the Internet and ICTs. This point was explicitly raised in a personal email exchange between Vint Cerf — one of the “fathers of the Internet” who co-invented TCP/IP — and I. We were discussing Google’s transition to fully renewable energy use for its data centers, and he posed two questions. After Vint gave me his consent to share the information from our exchange, I decided to publish it here as a follow-up to my October 2016 essay. The following was my substantial answer to his questions (which are listed below in bold). Also, for full disclosure, note that I often refer to Google as a case study because (1) Vint is vice president and chief Internet evangelist at Google, (2) his inquiry regarding Google’s data center efficiency is specifically what prompted the discussion, and (3) Google has been committed to reducing its carbon footprint for years as well as sharing that insight with other stakeholders, specifically in the private sector and technical community.

1. “Do you know whether the aggregate power requirements for the data centers exceed the power requirements for all the laptops, desktops, mobiles, tablets, home routers and Wi-Fi units, etc.?”

I do not have this information, but I can imagine it is a great deal when multiplied by the billions of devices that exist. I found two articles that list the wattage for various electronics (one from Daft Logic, the other from the American Council for the Energy Efficient Economy). I am not sure, though, if those numbers would reflect the various realities (and policy environments) of various non-U.S. electronics.

2. “What fraction of the power consumption does the Internet (and its access devices) take?”

I wish there was an easy number to cite, but unfortunately the numbers are constantly in flux — based on myriad factors taken into account during analysis as well as the number of devices and various optimizations to infrastructure like data centers (e.g., using renewable/green energy, using artificial intelligence (AI) to help increase efficiency, etc.). They often also do not take into account global numbers (as doing so would likely be much more difficult). Having said that, I found many sources that can help shed light on this question (while also shedding light on the first question he posed above):

To begin, the 2008 Global e-Sustainability initiative (GeSI) SMART2020 report, which examined how to enable the low carbon economy in the information age, indicated: “ICTs currently contribute 2 percent to 3 percent of global greenhouse gas (GHG) emissions.” To put this into perspective and even based on 2008 numbers, “If the Internet were a country, it would rank as the fifth-largest for energy consumption.” Note, however, that the 2015 GeSI Smarter2030 report stressed, “ICT emissions as a percentage of global emissions will decrease over time,” and the GeSI revised the percentage of total global carbon emissions predicted in their 2008 report “due to a range of investments companies in the sector have been making to reduce their emissions and to the expected improvements in the efficiency of ICT devices … [Therefore,] the ICT sector’s emissions ‘footprint’ is expected to decrease to 1.97 percent of global emissions by 2030, compared to 2.3 percent in 2020.”

Bear in mind as well that the numbers are constantly changing in terms of the environmental impact of the Internet. For instance, as reported in The Verge, Google “used some 4,402,836 megawatt-hours (MWh) of electricity in 2014 (equivalent to the amount of energy consumed by 366,903 U.S. households),” but that number is being offset by the amount of renewable energy and other innovations powering its infrastructure as well. Furthermore, according to CCCB Lab:

“The first thing that emerges after surveying various sources is that nobody knows for sure. In 2010, The Guardian came up with the figure of 300 million tons of [carbon dioxide (CO2)] per year, ‘as much as all the coal, oil and gas burned in Turkey or Poland in one year.’ A controversial article titled “Power, Pollution, and the Internet” in The New York Times put the figure at 30 billion watts of electricity in 2011, ‘roughly equivalent to the output of 30 nuclear power plants.’ And according to Gartner consultants, the Internet was responsible for 2 percent of global emissions in 2007, outstripping the carbon footprint of the aviation industry. A more recent study by the Melbourne, Australia-based Centre for Energy-Efficient Telecommunications (CEET) estimated in 2013 that the telecommunications industry as a whole emits 830 million tons of carbon dioxide a year — [accounting for 1.5 percent to 2 percent of the world’s energy consumption] — and that the energy demands of the Internet could double by 2020. Jon Koomey — [a research fellow at Stanford University’s Steyer-Taylor Center for Energy Policy who has been studying Internet energy effects since 2000 and identified a long-term trend in energy-efficiency of computing that has come to be known as Koomey’s Law] — estimates that the direct electricity use of all the elements that make up the Internet is probably around 10 percent of total electricity consumption, but he emphasizes that it is very difficult to calculate exact figures: ‘You can use a computer to play video games or write a text and not be online, and this energy use is often counted as part of the Internet even though it isn’t actually the case.’”

Additionally, a 2015 article published in The Atlantic presented the following data:

“According to the U.S. Energy Information Administration, in 2012 global electricity consumption was 19,710 billion kilowatt-hours (kWh). Using Google’s estimate [of its data center’s energy use] and electricity-consumption data from the CIA World Factbook, they’re using about as much electricity annually as the entire country of Turkey. (Honestly, that number seems impossibly high considering that in 2011 Google disclosed that it used merely 260 million watts of power, at the time noted for being slightly more than the entire electricity consumption of Salt Lake City.) In its 2013 sustainability report, Facebook stated its data centers used 986 million kWh of electricity — around the same amount consumed by Burkina Faso in 2012 … The impact of data centers — really, of computation in general — isn’t something that really galvanizes the public, partly because that impact typically happens at a remove from everyday life. The average amount of power to charge a phone or a laptop is negligible, but the amount of power required to stream a video or use an app on either device invokes services from data centers distributed across the globe, each of which uses energy to perform various processes that travel through the network to the device. One study … estimated that a smartphone streaming an hour of video on a weekly basis uses more power annually than a new refrigerator” [emphasis mine].

Another perspective to consider is how growth affects the numbers. For example, after interviewing Dr. Mike Hazas, one of the researchers from Lancaster University’s School of Computing and Communications involved in a study that warned how “the rapid growth of remote digital sensors and devices connected to the Internet [and the IoT] has the potential to bring unprecedented and, in principle, almost unlimited rises in energy consumed by smart technologies,” the writer of this article shared the following data:

“The increase in data use has brought with it an associated rise in energy use, despite improvements in energy efficiencies. Current estimates suggest the Internet accounts for 5 percent of global electricity use but is growing faster, at 7 percent a year, than total global energy consumption at 3 percent. Some predictions claim information technologies could account for as much as 20 percent of total energy use by 2030.”
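For a sense of what those quoted rates imply, the toy calculation below (my own illustration, not part of the cited study) compounds a 5 percent share growing at 7 percent per year against total consumption growing at 3 percent per year:

```python
# Toy projection of the Internet's share of global electricity use,
# using only the rates quoted above: a 5% share today, Internet demand
# growing 7%/year, and total consumption growing 3%/year.

def projected_share(initial_share: float, internet_growth: float,
                    total_growth: float, years: int) -> float:
    """Compound the Internet's demand and total demand separately,
    then take the ratio."""
    return initial_share * ((1 + internet_growth) / (1 + total_growth)) ** years

# Roughly 2017 -> 2030 under these constant-rate assumptions:
share_2030 = projected_share(0.05, 0.07, 0.03, 13)
print(f"{share_2030:.1%}")  # about 8.2%
```

Even naive compounding of the quoted rates lands closer to 8 percent than to the 20 percent upper-bound prediction by 2030, which shows how much the headline numbers depend on assumptions about future growth.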

Meanwhile, in 2013, The Register reported: “The information and technology ecosystem now represents around 10 percent of the world’s electricity generation.” It based this data on an August 2013 report written by Digital Power Group (DPG) CEO Mark P. Mills titled The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, and Big Power (disclaimer: it was sponsored by the American Coal Association, a pro-coal lobbying group). He wrote:

“Based on a mid-range estimate, the world’s [ICT] ecosystem uses about 1,500 terawatt-hours (TWh) of electricity annually, equal to all the electric generation of Japan and Germany combined — as much electricity as was used for global illumination in 1985. The ICT ecosystem now approaches 10 percent of world electricity generation. Or in other energy terms — the zettabyte era already uses about 50 percent more energy than global aviation … Hourly Internet traffic will soon exceed the annual traffic of the year 2000. And demand for data and bandwidth and the associated infrastructure are growing rapidly not just to enable new consumer products and video, but also to drive revolutions in everything from healthcare to cars, and from factories to farms. Historically, demand for bits has grown faster than the energy efficiency of using them. In order for worldwide ICT electric demand to merely double in a decade, unprecedented improvements in efficiency will be needed now” [emphasis theirs].
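Mills’ headline claim can be cross-checked against the EIA figure quoted earlier from The Atlantic (19,710 billion kWh of global electricity consumption in 2012). A quick sketch, with the caveat that Mills’ denominator is generation while the EIA figure is consumption, so the comparison is only approximate:

```python
# Cross-checking Mills' mid-range estimate against the EIA figure
# quoted earlier in this article (19,710 billion kWh = 19,710 TWh
# of global electricity consumption in 2012).

ict_twh = 1_500      # Mills' mid-range estimate for the ICT ecosystem (TWh/year)
global_twh = 19_710  # EIA, global electricity consumption, 2012

share = ict_twh / global_twh
print(f"{share:.1%}")  # about 7.6%
```

At roughly 7.6 percent, the mid-range estimate is in the ballpark of, but noticeably below, the “approaches 10 percent” framing, which gives a sense of how much rounding the headline involves.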

The Register’s report also emphasized the following about power consumption regarding personal devices: “Reduced to personal terms, although charging up a single tablet or smartphone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year. And as the world continues to electrify, migrating towards one refrigerator per household, it also evolves towards several smartphones and equivalent per person” [emphasis theirs]. (A methodology note from The Register: “This example used publicly available data on the average power utilization of a telecom network, the cost of wireless network infrastructure, and the energy that goes into making a tablet, although it ignored the data centers the video is served out of, and tablet charging” — in other words, though Google has purported that the cost of a Google search is 0.0003 kilowatt-hours (kWh) of energy, the likely cost is higher due to the power cost lurking in the non-Google systems used to deliver the data and perform the search. “[Furthermore,] the report’s figure reflects not just the cost of data centers — according to a 2007 report by the Environmental Protection Agency (EPA), U.S. data centers consumed 1.5 percent of U.S. electricity production, and was projected to rise to 3 percent by 2011 — but also the power involved in fabbing chips and the power consumption of digital devices and the networks they hang off.”)

It is important to highlight, however, that regarding the stated fact that direct electricity use of the Internet is probably around 10 percent of total electricity consumption, Koomey said the same thing during his keynote address at Google’s How Green is the Internet Summit in June 2013, but he immediately added that “the number does not tell us very much” (source). His words were further reinforced by the slides he presented at the event. On slide 7, he shared a graph based on data from a 2013 study using information collected for Sweden circa 2010 that showed annual electricity use (GWh/year) across various technological devices. It showed that user PCs accounted for approximately 1,800 GWh/year compared to the second-most energy-consuming devices, data centers and third-party local-area networks (LANs), which were responsible for close to 1,300 GWh/year. Other user equipment accounted for around 700 GWh/year, while the lowest-ranked technology, the Internet Protocol (IP) core network, was responsible for around 250 GWh/year. Whether this trend has been sustained since 2010, though, is unclear. (The Google event itself was bolstered by a blog post written that same month, which corresponded with the release of a report by the Lawrence Berkeley National Laboratory (Berkeley Lab) titled The Energy Efficiency Potential of Cloud-based Software: A U.S. Case Study. It showed that “migrating all U.S. office workers to the cloud could save up to 87 percent of information technology (IT) energy use — about 23 billion kilowatt-hours (kWh) of electricity annually, or enough to power the city of Los Angeles for a year.” Berkeley Lab also made its model publicly available “so other researchers and experts can plug in their own assumptions and help refine and improve the results.” Bear in mind that, ultimately, the goal in this case was not to emphasize the effects of personal electronics, but energy efficiency and management of larger technical infrastructure overall.)

There is also information available from a 2013 Time article that directly addresses some of the specifics regarding Vint’s second question and criticizes Mills’ study:

“It’s important to note that the amount of energy used by any smartphone will vary widely depending on how much wireless data the device is using, as well as the amount of power consumed in making those wireless connections — estimates for which vary. The above examples assume a relatively heavy use of 1.58 GB a month — a figure taken from a survey of Verizon iPhone users last year. That accounts for the high-end estimate of the total power the phone would be consuming over the course of a year. NPD Connected Intelligence, by contrast, estimates that the average smartphone is using about 1 gigabyte (GB) of cellular data a month, and in the same survey that reported high data use from Verizon iPhone users, T-Mobile iPhone users reported just 0.19 GB of data use a month — though that’s much lower than any other service. Beyond the amount of wireless data being streamed, total energy consumption also depends on estimates of how much energy is consumed per GB of data. The top example assumes that every GB burns through 19 kilowatt-hours (kWh) of electricity. That would be close to a worst-case model. The CEET assumes a much lower estimate of 2 kWh per GB of wireless data, which would lead to a much lower electricity consumption estimate as well — as little as 4.6 kWh a year with the low T-Mobile data use. In the original version of the post, I should have noted that there is a significant range in estimates of power use by wireless networks, and that this study goes with the very high end.”
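The spread the article describes is easy to reproduce. Taking the low end it mentions (T-Mobile users' reported 0.19 GB a month at CEET's 2 kWh per GB):

```python
# Reproducing the article's low-end smartphone network-energy estimate:
# 0.19 GB/month (reported T-Mobile iPhone average) at 2 kWh/GB (CEET).

monthly_gb = 0.19  # GB of cellular data per month
kwh_per_gb = 2     # CEET's low-end estimate of network energy per GB

annual_kwh = monthly_gb * 12 * kwh_per_gb
print(round(annual_kwh, 1))  # 4.6, matching the "as little as 4.6 kWh a year" figure
```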

A note on the smartphone energy-use calculations: they come from an email by Max Luke, a policy associate at the Breakthrough Institute, which posted about Mills’ study. He wrote:

“Last year [in 2012], the average iPhone customer used 1.58 GB of data a month, which times 12 is 19 GB per year. The most recent data put out by A.T. Kearney for the mobile industry association GSMA (p. 69) says that each GB requires 19 kWh. That means the average iPhone uses (19 kWh × 19 GB) 361 kWh of electricity per year. In addition, A.T. Kearney calculates each connection at 23.4 kWh. That brings the total to 384.4 kWh. The electricity used annually to charge the iPhone is 3.5 kWh, raising the total to 388 kWh per year. The EPA’s Energy Star shows refrigerators with efficiency as low as 322 kWh annually.”
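Luke's arithmetic can be followed step by step. A sketch, with units taken as kWh per GB (which is what the addition implies):

```python
# Reproducing Max Luke's high-end iPhone energy estimate, step by step.

annual_gb = 19        # 1.58 GB/month x 12, rounded to 19 GB/year as Luke does
kwh_per_gb = 19       # A.T. Kearney/GSMA figure (energy per GB, in kWh)

network_kwh = annual_gb * kwh_per_gb  # 361 kWh for wireless data
connection_kwh = 23.4                 # per-connection overhead (A.T. Kearney)
charging_kwh = 3.5                    # annual electricity to charge the phone

total_kwh = network_kwh + connection_kwh + charging_kwh
print(total_kwh)  # 387.9, which Luke rounds to "388 kWh per year"

# For comparison, the most efficient Energy Star refrigerators cited: 322 kWh/year.
```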

The Time article continued: “Breakthrough ran the numbers on the iPhone specifically — Mills’ endnotes (see page 44 in the report) refer to smartphones and tablets more generally — but Luke notes that Mills confirmed the calculations. These estimates are at the very high end — other researchers have argued that power use by smartphones is much lower. And the Mills study itself has come in for strong criticism from other experts.”

As this Forbes article noted:

“[Koomey said] he ‘spent years debunking’ Mills’ claims and published a paper in 2000 that directly contradicted his findings. Koomey [added] he was shocked to see Mills ‘rehashing’ his ideas now. ‘If he is making this claim again, that would be just crazy, outrageous,’ Koomey said. ‘What we found in 2000 is that a refrigerator used 2,000 times more electricity than the networking electricity of a wireless Palm Pilot. He is not a credible source of information.’ [Moreover,] Gernot Heiser, a professor at the University of New South Wales in Sydney and co-author of a 2010 study on power consumption in smartphones, echoed Koomey’s sentiments [that Mills’ work was flawed]. Heiser said Mills’ work ‘seems blatantly wrong.’ He said Mills overestimates the amount of power used by a modern smartphone, in this case a Galaxy S III, by more than four times. ‘I’d have to have a quick look to see how they arrive at this figure, but it certainly looks like baloney to me,’ Heiser said.”

Quoting from the Time article, “Gang Zhou, an associate professor of computer science at the College of William and Mary, was less direct in attacking Mills’ claims, but said his measurements for the power consumption of smartphones were at least ‘one or two magnitude[s]’ higher than they should be. Nonetheless, Zhou added that the subject of data center electricity usage is an important issue and it ‘should raise concern.’”

Koomey also reinforced the aforementioned criticism. In a 2013 article titled “Jonathan Koomey: Stop worrying about IT power consumption,” the author wrote:

“By 2010, for example, data centers accounted for approximately 1.3 percent of worldwide electricity use and 2 percent of U.S. electricity use, according to Koomey’s August 2011 paper, ‘Growth in Data Center Electricity Use, 2005 to 2010.’ This amount is growing, certainly, but at a far slower rate than we previously imagined. Still, that article helped inspire an industry-wide interest in the nexus of technology and energy efficiency that might otherwise have taken years to develop. ‘It was the process of debunking those claims that led me to spend a lot more time on data center electricity use and also on the electricity use of all sorts of computing devices,’ Koomey recalled. As he dug into the numbers, he actually discovered that efficiency has been improving since the days of vacuum tubes, a thesis he explored in his ‘One Great Idea’ presentation at the 2012 VERGE conference in Washington, D.C. This is one thing making the explosion of mobile devices such as smartphones and tablet computers viable, along with the associated reductions in the power consumption of client computing devices. Consider that a desktop computer uses roughly 150 kWh to 200 kWh of electricity annually, compared with 50 to 70 kWh for a notebook PC, 12 kWh for a tablet or 2 kWh for a smartphone. It’s also a very important development for the so-called Internet of Things, the vast network of sensors emerging to support a huge array of applications related to green buildings, intelligent transportation systems and so on. Despite suggestions otherwise, these applications should have very little impact on overall IT power consumption.”

Conclusion

Based on the outdated and often contradictory information available, I would stress that the ultimate answer to Vint’s question is that, unfortunately, it is inconclusive. Even a follow-up question Vint posted about the merits of switching to LED lighting in offsetting the power consumption of ICTs was undermined by a New Republic story that argued (according to the aforementioned Time article):

“The greenest building in New York City [at the time] — the Bank of America Tower, which earned the Leadership in Energy and Environmental Design‘s (LEED) highest Platinum rating — was actually one of the city’s biggest energy hogs. Author Sam Roudman argued that all the skyscraper’s environmentally friendly add-ons — the waterless urinals, the daylight dimming controls, the rainwater harvesting — were outweighed by the fact that the building used ‘more energy per square foot than any comparably sized office building in Manhattan,’ consuming more than twice as much energy per square foot as the 80-year-old (though recently renovated) Empire State Building.”

What is not undermined, however, is my rationale for exploring this topic more within the Internet community. While the Internet and ICTs are not the main contributor to climate change (compared to, say, energy production in general), there are a few considerations to keep in mind:

1. The issue of energy needed for infrastructure such as data centers as well as for electronic devices (regardless of size or scope) is essentially two sides of the same coin, but data center/server operators generally have much more centralized control over how such centers/servers are powered than end users do.

2. Private sector data centers are becoming more efficient and are increasingly run by renewable energy, but many Internet exchange points (IXPs), for instance, as well as other critical infrastructure and non-private sector structures (such as government servers) are not. (See, for example, the abovementioned Atlantic article: “But that’s leverage available to companies operating at the scale of Facebook and Google [to galvanize states to cut non-renewable/fossil fuel energy sources]. It’s not really something that smaller colocation services can pull off. Relative to the entire data-center industry — data centers run on university campuses, enterprise colocation providers, hospitals, government agencies, banks — companies like Facebook and Google are a pronounced, but still minor piece of the larger data-center landscape. Some smaller companies have been able to push for changes, but they tend to need one of the heavy-hitter companies to act as muscle first” [emphasis mine]).

3. As more people come online, more and more data will be generated — to the point where the amount of energy needed to power the infrastructure that supports such data could grow exponentially. As Mills’ report stressed:

“Future growth in electricity to power the global ICT ecosystem is anchored in just two variables: demand (how fast traffic grows) and supply (how fast technology efficiency improves). As costs keep plummeting, how fast do another billion people buy smartphones and join wireless broadband networks where they will use 1,000 times more data per person than they do today; how fast do another billion, or more, join the Internet at all; how fast do a trillion machines and devices join the Internet to fuel the information appetite of Big Data? Can engineers invent, and companies deploy, more efficient ICT hardware faster than data traffic grows?”

Addressing each of these points — and what the Internet governance community can do about it — is critical. Given the inconclusive nature of this article, it is better to err on the side of caution — that is, address concerns related to energy and the environment within our domain, especially when investing in infrastructure upgrades. For instance, Koomey argued, “For in-house data centers that are standard business facilities, there is a strong case from both a cost and environmental perspective for going to the cloud.”

This also involves sharing best practices and solutions, working collaboratively to make current infrastructure more efficient and sustainable, and planning better for the future (which of course includes policy discussions), as well as examining our entire production process and incorporating a more circular economy. Extending this logic to ICTs means addressing not merely the infrastructure and processes governing the Internet, but also other aspects of the information society: wireless infrastructure (e.g., towers and routers), wired infrastructure (e.g., manufacturing and laying fiber, including underwater cable), the recyclability and sustainability of Internet-connected devices (e.g., manufacturing processes, recycling, and resource acquisition), and where the materials for such devices will come from in order to help the next billion(s) get online.

Written by Michael Oghia, independent #netgov consultant & editor




The post Shedding Light on How Much Energy the Internet and ICTs Consume appeared on IPv6.net.


Homeland Security Invests $1M in five IoT Security Startups

The Department of Homeland Security (DHS) Science and Technology Directorate (S&T) announced its $1M investment in five IoT security startups: Factom, Whitescope, M2Mi, Ionic Security, and Pulzze Systems.

DHS aims to improve situational awareness of security within the Internet of Things by funding these startups. The announcement was made on Jan 21, 2017.

The five IoT security startups, selected through S&T’s Silicon Valley Innovation Program call titled ‘Securing the Internet of Things,’ will each produce and demonstrate a pilot-ready prototype to qualify for the third phase of the program.

The major focus of each funded startup is as follows:

Atlanta-based Ionic Security received approximately $200K to develop a distributed data protection model that will solve authentication, detection, and confidentiality challenges impacting distributed IoT devices. Ionic’s total equity funding stands at $122.44M across 7 rounds from 22 investors; Amazon also participated in Ionic’s $45M Series D round.


Austin, Texas-based Factom received $199K from DHS to deliver solutions related to quality control, due diligence, and auditing by leveraging the blockchain, helping prevent spoofing and ensure data integrity. The startup has also secured $6.49M across 5 rounds from 4 investors.

California-based M2Mi received $200K to deploy an open source version of the SPECK cipher, which will allow a lightweight crypto package to run on IoT devices.

Another California-based startup, Whitescope LLC, received $200K to build a working prototype of a secure wireless communications gateway for IoT devices.

California-based Pulzze Systems, which also received $200K from DHS, will address the infrastructure visibility problem by providing dynamic detection as components connect to or disconnect from a networked system.
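For context on the M2Mi item above: SPECK is a family of lightweight block ciphers (published by the NSA in 2013) built entirely from modular addition, rotation, and XOR, which is why it suits constrained IoT hardware. The sketch below is a minimal illustration of the Speck128/128 variant from the public specification; it is not M2Mi's actual package, and the function names are my own:

```python
# Minimal Speck128/128 sketch: 64-bit words, 32 rounds, rotations (8, 3).
# Illustrative only; follows the public Simon/Speck specification.

MASK = (1 << 64) - 1  # all arithmetic is modulo 2^64

def ror(x: int, r: int) -> int:
    return ((x >> r) | (x << (64 - r))) & MASK

def rol(x: int, r: int) -> int:
    return ((x << r) | (x >> (64 - r))) & MASK

def expand_key(k: int, l: int, rounds: int = 32) -> list:
    """Derive round keys from a 128-bit key given as two 64-bit words."""
    round_keys = [k]
    for i in range(rounds - 1):
        l = ((k + ror(l, 8)) & MASK) ^ i
        k = rol(k, 3) ^ l
        round_keys.append(k)
    return round_keys

def encrypt(x: int, y: int, round_keys: list) -> tuple:
    """Encrypt one 128-bit block given as two 64-bit words (x, y)."""
    for k in round_keys:
        x = ((ror(x, 8) + y) & MASK) ^ k
        y = rol(y, 3) ^ x
    return x, y

def decrypt(x: int, y: int, round_keys: list) -> tuple:
    """Invert the rounds in reverse key order."""
    for k in reversed(round_keys):
        y = ror(y ^ x, 3)
        x = rol(((x ^ k) - y) & MASK, 8)
    return x, y
```

Because each round only adds, rotates, and XORs machine words, implementations compile down to a handful of instructions per round, which is the property that makes ciphers like this attractive for small IoT devices.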


Term Sheet — Wednesday, March 15

By Erin Griffith

THE VC BEHIND THE COSMETICS COUNTER

Good morning, Term Sheet readers. Today’s column is from Fortune editor Matt Heimer. Enjoy.

In my life to date, I’ve watched umpteen thousand hours of prime-time television and leafed my way through a metric ton of magazines. So I’ve always been at least dimly aware of L’Oréal–the world’s biggest cosmetics company, and one of its top three spenders on advertising. Not that I’m a customer: On the fashion and skin care front, I’m one of those guys for whom the pejorative “basic” was coined. But L’Oréal is part of my mental and cultural wallpaper, along with similarly huge consumer brands that I encounter every day but don’t patronize, like Nike or McDonald’s.

Until very recently, if you had asked me how L’Oréal got so big, I would have said something appropriately dense and male like, “I dunno, I guess they invented a bunch of different makeup and sold it?” If you feel the same way, your eyes will be opened as mine were when you read my colleague Erin Griffith’s interview with Jean-Paul Agon, L’Oréal’s CEO since 2006, from this month’s issue of Fortune magazine. (Erin, whom you know as your regular Term Sheet columnist, interviewed Agon just before she left for vacation.)

It turns out that L’Oréal didn’t grow its beauty empire organically–or at least, it hasn’t done so since the Mad Men era. As Agon told Erin, the company has been expanding mostly via M&A: “Our model is, and has been for 50 years, to buy a brand at an early stage that we think can become a global successful player.” If spotting incipient success stories before they take off sounds a lot like the work of a venture capitalist…well, you’re on to something.

Of course, that analogy goes only so far. Being bought by L’Oréal is less like getting seed money from a VC and more like being acquired by the General Motors of makeup. This is a 108-year-old company that operates in 140 countries, runs an enormous research arm and owns an extra-large filing cabinet full of patents. If it had to grow only organically, it would probably do just fine.

But Agon describes L’Oréal’s management culture as one that may have more in common with startups than with other Global 500 corporations. That culture involves flexibility and openness to new ideas and new technologies, a combination that Agon dubs “organized chaos.” And in a beauty market where tastes change rapidly, it makes more sense for L’Oréal to acquire promising trend-setters than to play catch-up to those trends with internal R&D.

L’Oréal’s own name brand is relatively traditional and even conservative by beauty industry standards; this is a company, after all, that used to be called Société Française de Teintures Inoffensives pour Cheveux (“Safe Hair Dye Company of France”). So its in-house marketing geniuses may not have been likely to come up on their own with, say, a brand concept called Urban Decay. But a roving eye for good ideas outside its own walls helped the company spot that very hot brand and target it as a worthy acquisition, in 2012. (L’Oréal’s robust free cash flow, well north of $3.5 billion in each of the past four years, makes pulling the trigger on such acquisitions much easier.)

Perhaps most encouraging of all for the leaders of beauty startups, L’Oréal presents itself as a company that won’t mess with your brainchild after adopting it. “We offer [acquisition targets] the total respect of the identity, culture, spirit and soul of the brand,” Agon told Fortune. Music to a founder’s ears.

What’s Agon’s textbook example of success on the brand-integrity front? Kiehl’s, the New York boutique skin care brand, which L’Oréal has owned since 2000. It turns out there’s a bottle of their lotion in my travel bag. (Trust me, the scent is masculine.) So I guess I’m a L’Oréal customer after all. Who’s basic now? – Matt Heimer

File this in the “whoops” folder: Yesterday’s Term Sheet incorrectly stated that the VC firm Rokk3r Fuel has closed its inaugural fund. It is seeking to raise $150 million; it has not yet raised that amount. Additionally, in the item on Spectrum Equity and Cressey & Co.’s growth investment in Verisys, the hyperlink was incorrect. (Here’s the correct one). Apologies!

THE LATEST FROM FORTUNE…

[ts_bullet_primary] For advertisers, Instagram > Snapchat.

[ts_bullet_primary] Ex-Zenefits CEO Parker Conrad just launched a new HR startup.

[ts_bullet_primary] Donald Trump’s tax return showed he paid more than people thought. But it also suggests he understated his salary by millions.

[ts_bullet_primary] Big Food reconsiders its relationship with sugar, fat, and salt.

[ts_bullet_primary] The health startup trying to take on the multibillion-dollar diet industry.

[ts_bullet_primary] Commentary: Airbnb is way more competitive than Uber.

[ts_bullet_primary] Clifton Leaf has been named editor-in-chief of Fortune.

…AND ELSEWHERE

Domino’s high-tech $9 billion pizza empire. The toxic trouble brewing at Thinx. Lucky Peach’s days are numbered. What if the future of fashion is spider silk? New questions arise on the safety of a Monsanto weed killer.

VENTURE DEALS

[ts_bullet_primary] ServiceTitan, a Glendale, Calif. provider of business management software for plumbing and electrical service companies, raised $80 million in a Series B funding. ICONIQ Capital led the round.

[ts_bullet_primary] Visier, a San Jose, Calif.-based provider of workforce intelligence software, raised $45 million in Series D funding. Sorenson Capital led the round, and was joined by Foundation Capital, Summit Partners, and Adams Street Partners.

[ts_bullet_primary] Innovium, a San Jose, Calif. provider of networking silicon solutions for data centers, raised $38.3 million in Series C funding. Redline Capital led the round, and was joined by Greylock Partners, Walden Riverwood Ventures, Capricorn Investment Group, Qualcomm Ventures, and S-Cubed Capital.

[ts_bullet_primary] Evrythng, a New York-based IoT platform, raised $24.8 million in Series B funding. Sway Ventures led the round, and was joined by Generation Ventures and BLOC Ventures.


[ts_bullet_primary] Evolv Technology, a Waltham, Mass. security platform, raised $18 million in Series B funding. Investors include Lux Capital, Bill Gates, General Catalyst, and DCVC.

[ts_bullet_primary] Infoworks, a San Jose, Calif. platform for end-to-end data warehousing, raised $15 million in Series B funding. Centerview Capital Technology led the round, and was joined by Nexus Venture Partners.

[ts_bullet_primary] Flow, a Hoboken, N.J. provider of a platform for cross-border commerce, raised $13 million in Series A funding from Bain Capital Ventures.

[ts_bullet_primary] Dyadic Security, an Israeli security software provider, raised $12 million in Series B funding. Goldman Sachs Principal Strategic Investments, Citi Ventures, and Innovation Endeavors led the round.

[ts_bullet_primary] LimeBike, a San Francisco bike-sharing network, raised $12 million in Series A funding. Andreessen Horowitz led the round, and was joined by Stanford University, IDG, and DCM.

[ts_bullet_primary] Bringg, an Israeli logistics platform for enterprises, raised $10 million in funding. Aleph VC led the round, and was joined by Coca-Cola (NYSE:KO) and Pereg Ventures.

[ts_bullet_primary] Goodlord, a London software provider for real estate agents and renters, raised £7.2 million ($8.9 million) from Ribbit Capital, LocalGlobe, and Global Founders Capital.

[ts_bullet_primary] Liven, an Australian hospitality-tech startup, raised A$10 million ($7.6 million) in funding from an unnamed Melbourne-based venture capital firm.

HEALTH + LIFE SCIENCES DEALS

[ts_bullet_primary] CellAegis Devices, a Toronto medical device company, raised $9.5 million in Series C funding.

[ts_bullet_primary] NeuroOne, a Minnetonka, Minn. developer of neuromonitoring and neuromodulating products, raised $1.2 million in seed funding from investors including FundRx.

PRIVATE EQUITY DEALS

[ts_bullet_primary] Bain Capital is close to acquiring .


[ts_bullet_primary] Atlantic Street Capital acquired a stake in Planet Fit Indy 10, which operates Planet Fitness (NYSE:PLNT) health clubs in the greater Indianapolis area.

[ts_bullet_primary] Excelligence Learning Corporation, which is backed by Brentwood Associates, acquired ChildCare Education Institute, a Duluth, Ga. provider of online training and certificates for the early child care and education market. Financial terms weren’t disclosed.

[ts_bullet_primary] Cedar Springs Capital and Crestline Investors acquired a majority stake in CarePayment, a Nashville, Tenn.-based company that helps patients manage their health-care expenses.

[ts_bullet_primary] Shore Capital Partners formed and invested in EyeSouth Partners, a provider of support services to affiliated eye care practices.

OTHER DEALS

[ts_bullet_primary] Euronet Worldwide (Nasdaq:EEFT) offered $1 billion to acquire rival .

[ts_bullet_primary] Citrix Systems (Nasdaq:CTXS), which has a market cap of $13.2 billion, is considering strategic alternatives, such as putting itself up for sale. Read more at Fortune.

[ts_bullet_primary] Volkswagen (XTRA:VOW3), still struggling from the fall out of its emission scandal, indicated it could be open to a merger with Fiat Chrysler (BIT:FCA). Read more at Fortune.

[ts_bullet_primary] Neiman Marcus, a Dallas-based luxury fashion retailer, is putting itself up for sale. Read more at Fortune.

[ts_bullet_primary] American Securities agreed to acquire Air Methods Corporation (Nasdaq:AIRM) for an enterprise value of about $2.5 billion. At a price of $43 per Air Methods share, American Securities’ offer represents a 20.4% premium on the company’s stock price on January 31, 2017 prior to news regarding a sale.

[ts_bullet_primary] TechStyle Fashion Group, which owns the Kate Hudson-fronted .

[ts_bullet_primary] Harmony Merger (Nasdaq:HRMN), an acquisition company, agreed to merge with .

IPOS

[ts_bullet_primary] .

EXITS

[ts_bullet_primary] Active Interest Media, an El Segundo, Calif. Medica company backed by Wind Point Partners, sold Yachting Promotions, a Fort Lauderdale, Fla. operator of yachting and boat shows in the U.S., to Informa (LSE: INF.L).

FIRMS + FUNDS

[ts_bullet_primary] Bill Maris, the founder and ex-CEO of Google Ventures, is starting a venture fund after all, Bloomberg reports. After walking away from a $230 million health-focused fund late last year, Maris has decided to launch a new $100 million fund that will also focus on biotech and health-related companies. .

PEOPLE

[ts_bullet_primary] Mike Mogul joined Healthpoint Capital as a member of the firm’s leadership team. Mogul is the former CEO of DJO Global.

[ts_bullet_primary] Peter Coroneos joined Z Capital Group as a managing director and the global head of corporate development.

[ts_bullet_primary] B Capital Group announced a series of new hires: Kabir Narang joined the firm as an investment partner, Virginia Schmitt as chief financial officer and chief administrative officer, Chip Welsh and Dave Gallon as vice presidents, and Hailey Hu as a senior associate. In addition, the firm promoted Adam Seabrook from senior associate to principal.

SHARE TODAY’S TERM SHEET

Term Sheet is produced by Laura Entis. Submit deal items here. View this email in your browser.

Read more here:: fortune.com/tech/feed/

Term Sheet — Wednesday, March 15

By News Aggregator

By Erin Griffith

THE VC BEHIND THE COSMETICS COUNTER

Good morning, Term Sheet readers. Today’s column is from Fortune editor Matt Heimer. Enjoy.

In my life to date, I’ve watched umpteen thousand hours of prime-time television and leafed my way through a metric ton of magazines. So I’ve always been at least dimly aware of L’Oréal–the world’s biggest cosmetics company, and one of its top three spenders on advertising. Not that I’m a customer: On the fashion and skin care front, I’m one of those guys for whom the pejorative “basic” was coined. But L’Oréal is part of my mental and cultural wallpaper, along with similarly huge consumer brands that I encounter every day but don’t patronize, like Nike or McDonald’s.

Until very recently, if you had asked me how L’Oréal got so big, I would have said something appropriately dense and male like, “I dunno, I guess they invented a bunch of different makeup and sold it?” If you feel the same way, your eyes will be opened as mine were when you read my colleague Erin Griffith’s interview with Jean-Paul Agon, L’Oréal’s CEO since 2006, from this month’s issue of Fortune magazine. (Erin, whom you know as your regular Term Sheet columnist, interviewed Agon just before she left for vacation.)

It turns out that L’Oréal didn’t grow its beauty empire organically–or at least, it hasn’t done so since the Mad Men era. As Agon told Erin, the company has been expanding mostly via M&A: “Our model is, and has been for 50 years, to buy a brand at an early stage that we think can become a global successful player.” If spotting incipient success stories before they take off sounds a lot like the work of a venture capitalist…well, you’re on to something.

Of course, that analogy goes only so far. Being bought by L’Oréal is less like getting seed money from a VC and more like being acquired by the General Motors of makeup. This is a 108-year-old company that operates in 140 countries, runs an enormous research arm and owns an extra-large filing cabinet full of patents. If it had to grow only organically, it would probably do just fine.

But Agon describes L’Oréal’s management culture as one that may have more in common with startups than with other Global 500 corporations. That culture involves flexibility and openness to new ideas and new technologies, a combination that Agon dubs “organized chaos.” And in a beauty market where tastes change rapidly, it makes more sense for L’Oréal to acquire promising trend-setters than to play catch-up to those trends with internal R&D.

L’Oréal’s own name brand is relatively traditional and even conservative by beauty industry standards; this is a company, after all, that used to be called Société Française de Teintures Inoffensives pour Cheveux (“Safe Hair Dye Company of France”). So its in-house marketing geniuses may not have been likely to come up on their own with, say, a brand concept called Urban Decay. But a roving eye for good ideas outside its own walls helped the company spot that very hot brand and target it as a worthy acquisition, in 2012. (L’Oréal’s robust free cash flow, well north of $3.5 billion in each of the past four years, makes pulling the trigger on such acquisitions much easier.)

Perhaps most encouraging of all for the leaders of beauty startups, L’Oréal presents itself as a company that won’t mess with your brainchild after adopting it. “We offer [acquisition targets] the total respect of the identity, culture, spirit and soul of the brand,” Agon told Fortune. Music to a founder’s ears.

What’s Agon’s textbook example of success on the brand-integrity front? Kiehl’s, the New York boutique skin care brand, which L’Oréal has owned since 2000. It turns out there’s a bottle of their lotion in my travel bag. (Trust me, the scent is masculine.) So I guess I’m a L’Oréal customer after all. Who’s basic now? – Matt Heimer

File this in the “whoops” folder: Yesterday’s Term Sheet incorrectly stated that the VC firm Rokk3r Fuel has closed its inaugural fund. It is seeking to raise $150 million; it has not yet raised that amount. Additionally, in the item on Spectrum Equity and Cressey & Co.’s growth investment in Verisys, the hyperlink was incorrect. (Here’s the correct one). Apologies!

THE LATEST FROM FORTUNE…

• For advertisers, Instagram > Snapchat.

• Ex-Zenefits CEO Parker Conrad just launched a new HR startup.

• Donald Trump’s tax return showed he paid more than people thought. But it also suggests he understated his salary by millions.

• Big Food reconsiders its relationship with sugar, fat, and salt.

• The health startup trying to take on the multibillion-dollar diet industry.

• Commentary: Airbnb is way more competitive than Uber.

• Clifton Leaf has been named editor-in-chief of Fortune.

…AND ELSEWHERE

Domino’s high-tech $9 billion pizza empire. The toxic trouble brewing at Thinx. Lucky Peach‘s days are numbered. What if the future of fashion is spider silk. New questions arise on the safety of a Monsanto weed killer.

VENTURE DEALS

• ServiceTitan, a Glendale, Calif. provider of business management software for plumbing and electrical service companies, raised $80 million in Series B funding. ICONIQ Capital led the round.

• Visier, a San Jose, Calif.-based provider of workforce intelligence software, raised $45 million in Series D funding. Sorenson Capital led the round, and was joined by Foundation Capital, Summit Partners, and Adams Street Partners.

• Innovium, a San Jose, Calif. provider of networking silicon solutions for data centers, raised $38.3 million in Series C funding. Redline Capital led the round, and was joined by Greylock Partners, Walden Riverwood Ventures, Capricorn Investment Group, Qualcomm Ventures, and S-Cubed Capital.

• Evrythng, a New York-based IoT platform, raised $24.8 million in Series B funding. Sway Ventures led the round, and was joined by Generation Ventures and BLOC Ventures.

• Evolv Technology, a Waltham, Mass. security platform, raised $18 million in Series B funding. Investors include Lux Capital, Bill Gates, General Catalyst, and DCVC.

• Infoworks, a San Jose, Calif. platform for end-to-end data warehousing, raised $15 million in Series B funding. Centerview Capital Technology led the round, and was joined by Nexus Venture Partners.

• Flow, a Hoboken, N.J. provider of a platform for cross-border commerce, raised $13 million in Series A funding from Bain Capital Ventures.

• Dyadic Security, an Israeli security software provider, raised $12 million in Series B funding. Goldman Sachs Principal Strategic Investments, Citi Ventures, and Innovation Endeavors led the round.

• LimeBike, a San Francisco bike-sharing network, raised $12 million in Series A funding. Andreessen Horowitz led the round, and was joined by Stanford University, IDG, and DCM.

• Bringg, an Israeli logistics platform for enterprises, raised $10 million in funding. Aleph VC led the round, and was joined by Coca-Cola (NYSE:KO) and Pereg Ventures.

• Goodlord, a London software provider for real estate agents and renters, raised £7.2 million ($8.9 million) from Ribbit Capital, LocalGlobe, and Global Founders Capital.

• Liven, an Australian hospitality-tech startup, raised A$10 million ($7.6 million) in funding from an unnamed Melbourne-based venture capital firm.

HEALTH + LIFE SCIENCES DEALS

• CellAegis Devices, a Toronto medical device company, raised $9.5 million in Series C funding.

• NeuroOne, a Minnetonka, Minn. developer of neuromonitoring and neuromodulating products, raised $1.2 million in seed funding from investors including FundRx.

PRIVATE EQUITY DEALS

• Bain Capital is close to acquiring .

• Atlantic Street Capital acquired a stake in Planet Fit Indy 10, which operates Planet Fitness (NYSE:PLNT) health clubs in the greater Indianapolis area.

• Excelligence Learning Corporation, which is backed by Brentwood Associates, acquired ChildCare Education Institute, a Duluth, Ga. provider of online training and certificates for the early child care and education market. Financial terms weren’t disclosed.

• Cedar Springs Capital and Crestline Investors acquired a majority stake in CarePayment, a Nashville, Tenn.-based company that helps patients manage their health-care expenses.

• Shore Capital Partners formed and invested in EyeSouth Partners, a provider of support services to affiliated eye care practices.

OTHER DEALS

• Euronet Worldwide (Nasdaq:EEFT) offered $1 billion to acquire rival .

• Citrix Systems (Nasdaq:CTXS), which has a market cap of $13.2 billion, is considering strategic alternatives, such as putting itself up for sale. Read more at Fortune.

• Volkswagen (XTRA:VOW3), still struggling with the fallout of its emissions scandal, indicated it could be open to a merger with Fiat Chrysler (BIT:FCA). Read more at Fortune.

• Neiman Marcus, a Dallas-based luxury fashion retailer, is putting itself up for sale. Read more at Fortune.

• American Securities agreed to acquire Air Methods Corporation (Nasdaq:AIRM) at an enterprise value of about $2.5 billion. At $43 per Air Methods share, American Securities’ offer represents a 20.4% premium over the company’s stock price on January 31, 2017, prior to news of a sale.

• TechStyle Fashion Group, which owns the Kate Hudson-fronted .

• Harmony Merger (Nasdaq:HRMN), an acquisition company, agreed to merge with .
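As a sanity check on the Air Methods numbers above, the offer price and the reported 20.4% premium imply a pre-announcement share price. The sketch below simply backs that figure out; the implied price is derived here, not stated in the item.

```python
# Sanity check: back out the implied pre-announcement share price
# from the $43-per-share offer and the reported 20.4% premium.

OFFER_PRICE = 43.00   # per Air Methods share (from the item above)
PREMIUM = 0.204       # reported premium over the Jan 31, 2017 price

implied_prior_close = OFFER_PRICE / (1 + PREMIUM)
print(f"Implied Jan 31, 2017 price: ${implied_prior_close:.2f}")
```

That works out to roughly $35.71 per share, which is consistent with the $43 offer carrying a 20.4% premium.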

IPOS

EXITS

• Active Interest Media, an El Segundo, Calif. media company backed by Wind Point Partners, sold Yachting Promotions, a Fort Lauderdale, Fla. operator of yachting and boat shows in the U.S., to Informa (LSE: INF.L).

FIRMS + FUNDS

• Bill Maris, the founder and ex-CEO of Google Ventures, is starting a venture fund after all, Bloomberg reports. After walking away from a $230 million health-focused fund late last year, Maris has decided to launch a new $100 million fund that will also focus on biotech and health-related companies.

PEOPLE

• Mike Mogul joined Healthpoint Capital as a member of the firm’s leadership team. Mogul is the former CEO of DJO Global.

• Peter Coroneos joined Z Capital Group as a managing director and the global head of corporate development.

• B Capital Group announced a series of new hires: Kabir Narang joined the firm as an investment partner, Virginia Schmitt as chief financial officer and chief administrative officer, Chip Welsh and Dave Gallon as vice presidents, and Hailey Hu as a senior associate. In addition, the firm promoted Adam Seabrook from senior associate to principal.

SHARE TODAY’S TERM SHEET

Term Sheet is produced by Laura Entis. Submit deal items here. View this email in your browser.

Read more here:: fortune.com/tech/feed/

Tryst Energy makes SD-card-sized IoT solar panel for €59

Tryst Light Energy is energy-harvesting hardware intended for IoT devices. It needs four hours of sunshine (at a relatively low illuminance of only 200 lux) to charge IoT-based sensors for 24 hours.

The startup, which launched on Kickstarter, is offering an environment edition, a movement edition, and a dev edition, priced at €74, €69, and €59 respectively.

A feature list of the three editions can be viewed on the company’s Kickstarter campaign page.

Tryst launched its Kickstarter fundraising campaign on March 9th, 2017. With 77 backers and 27 days to go, Tryst had €4,981 pledged of its €30,000 goal at the time of reporting.

The majority of IoT solutions require batteries to power their sensors. Batteries need to be charged, replaced, and maintained. Another disadvantage of battery-powered IoT solutions is that batteries charge slowly and wear out after a limited number of charge cycles. Tryst claims to eliminate the need for batteries.

Its environment edition lets you measure temperature, humidity, and the amount of CO2 in the air. The movement edition measures motion, finds your location, and records your altitude. The dev edition is for developers and contains only the basics (programmability, connectivity & the Light Energy module).

The device contains an energy module, an energy-storage capacitor (Super-Cap), an MCU, and a radio (Bluetooth 4.2 and LoRaWAN). The real magic is in the Super-Cap, which stores around 1.7 mWh for moments without light or peak usage, with a life expectancy of 50,000 cycles. Simply put, the capacitor stores and deploys small amounts of energy super-fast with little to no wear.
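That 1.7 mWh figure is easy to put in perspective with a back-of-envelope calculation. The sketch below converts the stated Super-Cap capacity to joules and estimates how long it could carry a sensor through darkness; the 50 µW average draw is a hypothetical figure for a low-power IoT node, not a number from the campaign.

```python
# Back-of-envelope: how long could the Super-Cap's stated 1.7 mWh
# carry a sensor with no light? The 50 uW average draw is an
# assumed figure for a low-power IoT node, not a campaign spec.

CAPACITY_MWH = 1.7        # stated Super-Cap capacity
AVG_DRAW_W = 50e-6        # assumed average draw: 50 microwatts

energy_joules = CAPACITY_MWH * 1e-3 * 3600      # 1 Wh = 3600 J
runtime_hours = energy_joules / AVG_DRAW_W / 3600

print(f"Stored energy: {energy_joules:.2f} J")
print(f"Estimated dark runtime: {runtime_hours:.1f} h")
```

Under that assumed load, the capacitor holds about 6.1 J, enough for roughly a day and a half without any light, which squares with the company's claim that it bridges "moments without light or peak usage" rather than serving as long-term storage.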

Read more here:: feeds.feedburner.com/iot