
Uber, Lyft, & other ride sharing services to see driver numbers double, reaching 8.6 million by 2022

By Sheetal Kumbhar

A new study from Juniper Research has found that a surge in shared transport will continue, with driver and passenger numbers seeing substantial growth over the next 5 years. The new research, Sharing Economy: Opportunities, Impacts & Disruptors 2017-2022, forecasts that the number of ride sharing drivers will increase from an estimated 4.3 million in 2017, to 8.6 […]

The post Uber, Lyft, & other ride sharing services to see driver numbers double, reaching 8.6 million by 2022 appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/


The post Uber, Lyft, & other ride sharing services to see driver numbers double, reaching 8.6 million by 2022 appeared on IPv6.net.


Hitachi Undergoes Major Reorganization to Compete With GE

By Barb Darrow

Hitachi, the global Tokyo-based conglomerate, is combining three of its U.S.-based tech units–Hitachi Data Systems, Hitachi Insight Group, and Pentaho–into a single company to be called Hitachi Vantara.

The new company will compete in the Internet of things sector against such players as General Electric (GE), as well as Microsoft (MSFT), Amazon (AMZN), and Google (GOOGL)–all of which field their own IoT platforms, says Stacy Crook, research director at IDC.

However, Hitachi Vantara execs say it will not compete directly with Microsoft Azure, Amazon Web Services, or Google Cloud Platform to sell basic computing, storage, and networking to customers.

“This is all about standing up end-to-end solutions that can run on-premises or on Azure or Amazon,” says Bobbi Soni, Vantara’s chief solutions and services officer and another HDS veteran. “Our strategy is to bring industrial IoT expertise, our software and ability to manage and run technology. We don’t think many companies are in a position to pull all those things together.”

Related: Amazon Sets Sights on Huge Internet of Things Opportunity

With the reorganization, one entity will sell HDS’s data center infrastructure, Insight’s big data software, and Pentaho analytics. Together, those companies claim annual revenue of $4 billion. HDS purchased Pentaho in 2015 for a reported $500 million to $600 million.

Hitachi Vantara will be a wholly owned subsidiary of Hitachi Ltd., but will be independently managed, Soni tells Fortune in advance of the announcement at Hitachi’s Next Conference in Las Vegas on Tuesday. Vantara management also looks a lot like HDS management: HDS chief executive Ryuichi Otsuki and president/chief operating officer Brian Householder are assuming the same positions at the new company. The company will also be based in HDS’s Santa Clara, Calif. headquarters.

Together, the three component companies claim customers including Disney (DIS), BMW (BMWYY), Verizon (VZ), Infosys, and Marks & Spencer.

Get Data Sheet, Fortune’s daily tech newsletter.

One priority will be to push Insight’s Lumada Internet of things technology as a platform for connected devices in industrial settings. In the industrial Internet of things, sensors on factory floors, in mines, in airplane engines, or in other remote or hostile locations send data to an aggregation point, where it can be analyzed to help manufacturers fix problems before they become dangerous.

Related: GE Kills Plans to Build Its Amazon-like Cloud

To date, Lumada itself has been more of a roadmap than a sellable product. But that will change now, says IDC’s Crook. “The big reveal is now that Lumada will be a standalone commercial offering and sit at the middle of Hitachi’s IoT strategy,” she says.

Hitachi, Crook explains, is bringing together all these related assets it already had and telling a better story about how they work together.

Read more here:: fortune.com/tech/feed/


The post Hitachi Undergoes Major Reorganization to Compete With GE appeared on IPv6.net.


The Next Generation Analytics Database – Accelerated by GPUs

By Ana Vasquez


As organizations demand more and more from their analytics and data science teams, processing power looms as one of the fundamental roadblocks to easy success.

Organizations are facing a variety of processing-related data challenges: data centers sprawling to hundreds of nodes to provide processing power; data architects turning to convoluted pipelines to accommodate specialized tools; and business users frustrated by slow BI tools and high-latency results from batch queries.

A new generation of databases, accelerated by NVIDIA GPUs, is providing the way forward.

GPUs offer thousands of processing cores and are ideal for general-purpose parallelized computation—in addition to video processing! GPUs differ significantly from standard CPUs: today’s GPUs have around 4,500 cores (computational units) per device, compared with a CPU’s typical 8 or 16 cores. GPUs are now exploding in popularity in areas such as self-driving cars, medical imaging, computational finance, and bioinformatics, to name a few.

Analytics and data science tasks in particular benefit from parallelized compute on the GPU. Kinetica’s GPU-accelerated database vectorizes queries across the many thousands of GPU cores and can produce results in a fraction of the time compared to standard CPU-constrained databases. Analytics queries, such as SQL aggregations and GROUP BYs, are often reduced to seconds—down from several minutes with other analytics systems. Business users, working with tools such as Tableau or Power BI, see dashboards reload in an instant—no time to even think about getting coffee!
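The query shape being accelerated is ordinary SQL. A minimal, illustrative sketch of that aggregation pattern, using Python's built-in sqlite3 rather than any GPU engine (the table, data, and column names are invented for the example):

```python
import sqlite3

# Illustrative only: the kind of SQL aggregation + GROUP BY query the
# article says a GPU database vectorizes, shown on an in-memory table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("AMZN", 10, 950.0), ("AMZN", 5, 955.0), ("GOOGL", 2, 920.0)],
)

# A typical analytics query: aggregate functions grouped by a key.
rows = conn.execute(
    "SELECT symbol, SUM(qty), AVG(price) FROM trades "
    "GROUP BY symbol ORDER BY symbol"
).fetchall()
print(rows)  # [('AMZN', 15, 952.5), ('GOOGL', 2, 920.0)]
```

On a GPU database the same statement would be dispatched across thousands of cores; the SQL itself does not change, which is why BI tools like Tableau can benefit without modification.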

Solving the compute bottleneck also results in substantially smaller hardware requirements than before. Organizations are replacing 300-node Spark clusters with just 30 nodes of a GPU-accelerated database. They’re running models and queries significantly faster than with any other analytics solution and also benefiting from huge savings on datacenter and data management overhead.

Many financial organizations have been innovating with GPUs for more than five years and are among the first businesses to realize their value. Some of these financial companies are deploying thousands of GPUs to run algorithms — including Monte Carlo simulations — on rapidly changing, streaming trading data to compute risk, for example—essential for regulatory compliance.

A GPU-accelerated database that can natively perform custom computation on data distributed across multiple machines makes it easier for data science teams. Kinetica’s in-database analytics framework provides an API that makes it possible to do compute on the GPU using familiar languages such as Python. Customized data science workloads are now able to be run alongside business analytics in a single solution and on a single copy of the data.
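The underlying pattern is map-and-combine over partitions: a function runs where each chunk of data lives, and only small partial results travel. A hypothetical sketch in plain Python (this is not Kinetica's actual API; the partitioning and names are invented for illustration):

```python
# Hypothetical sketch of in-database distributed compute: a Python
# function executes against each data partition "next to the data",
# returning small partials that are combined -- the data never moves.

def partition_stats(partition):
    # Runs on the node holding this partition; returns (count, sum).
    return (len(partition), sum(partition))

# Three partitions as they might be spread across machines.
partitions = [[1.0, 2.0, 3.0], [4.0, 5.0], [6.0]]

# In a real system these calls run in parallel on separate nodes/GPUs.
partials = [partition_stats(p) for p in partitions]

count = sum(c for c, _ in partials)
total = sum(s for _, s in partials)
mean = total / count
print(mean)  # 3.5
```

The design point the article makes is that shipping a few numbers per partition is vastly cheaper than shipping the partitions themselves to an external analytics tool.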

Enterprises can run sophisticated data science workloads in the same database that houses the rich information needed to run the business and drive day-to-day decisions. This neatly solves the data movement challenge because there isn’t any data movement, which leads to simpler Lambda architectures and more efficient Machine Learning and AI workloads. Quants, data scientists, and analysts can deploy a model from a deep learning framework via a simple API call, or train their models on the latest data; users can experience the GPU’s benefits and in-memory processing without needing to learn new programming languages.

With Kinetica, in-database compute can be extended to machine learning libraries such as TensorFlow, Caffe (a deep learning framework), and Torch (a machine learning and neural-network framework). These libraries can be extremely compute-hungry, especially on massive datasets, and benefit greatly from GPU horsepower and distributed compute capabilities.

GPU-accelerated database solutions can be found in utilities and energy, healthcare, genomics research, automotive, retail, telecommunications, and many other industries. Adopters are combining traditional and transactional data, streaming data, and data from blogs, forums, social media sources, orbital imagery, and other IoT devices in a single, distributed, scalable solution.

Learn more: Advanced In-Database Analytics on the GPU

Download the eBook: How GPUs are Defining the Future of Data Analytics

Pick up your free copy of the new O’Reilly book “Introduction to GPUs for Data Analytics” from Kinetica booth #825 at Strata Data Conference in NY next week.

The post The Next Generation Analytics Database – Accelerated by GPUs appeared first on Datanami.

Read more here:: www.datanami.com/feed/


The post The Next Generation Analytics Database – Accelerated by GPUs appeared on IPv6.net.


Economics of IoT are ‘increasingly compelling’, says new Verizon report

By Sheetal Kumbhar

The Internet of Things (IoT) is at the core of digital transformation in 2017, with 73% of executives either researching or currently deploying IoT. Manufacturing, transportation and utilities make up the largest share of investments, while insurance and consumers represent the fastest areas of spending growth. With 8.4 billion connected “things” in use in 2017, […]

The post Economics of IoT are ‘increasingly compelling’, says new Verizon report appeared first on IoT Now – How to run an IoT enabled business.

Read more here:: www.m2mnow.biz/feed/


The post Economics of IoT are ‘increasingly compelling’, says new Verizon report appeared on IPv6.net.


AWS Partner Ryft Leverages Cloud FPGAs

By News Aggregator

By George Leopold

As more data moves to the cloud, analytics vendors are also embracing new cloud technologies designed to boost performance for emerging workloads such as business intelligence.

With that in mind, analytics specialist Ryft Systems Inc. said this week it would accelerate its business analytics offerings via Amazon Web Services’ (NASDAQ: AMZN) F1 FPGA-based instance announced last fall along with its “elastic GPU” service.

The accelerated business analytics service, now available on the AWS Marketplace, is designed to boost data search and analysis capabilities, including broader “fuzzy” search criteria. Specifically, the FPGA upgrade would allow users to perform advanced searches called Perl Compatible Regular Expressions, or PCRE2, on files before and after they are indexed. The FPGA-accelerated architecture would also allow fuzzy searches and matching on either indexed or un-indexed data, the company said.

Ryft, Rockville, Md., said its technology running on the Amazon F1 instance would help it leverage FPGA-accelerated computing hosted in the AWS cloud. The company released a pair of Amazon machine images on Wednesday (Sept. 13) used to deploy the new FPGA services on the AWS cloud.

The first is a toolkit designed to help users integrate new search and analysis capabilities into existing analytics interfaces or applications. The toolkit also is billed as “turning big data into small data” through analysis and “thinning” without data indexing.

The second image, called “Elasticsearch,” boosts the performance of the ubiquitous Lucene-based search engine by adding PCRE2 regular expression and fuzzy search capabilities. It also accelerates search and analysis across unstructured data and a range of files.

The analytics vendor asserts that data indexing requirements stifle the ability to mine data. The combination of Elasticsearch running on the AWS F1 instance reduced processing and searching a 1-Tb log file from more than 62 hours on a commodity AWS Elastic Compute Cloud server to 0.69 hours on the combined Ryft-F1 instance, the company claimed after benchmark testing. That, Ryft further claimed, equals a 91-fold performance increase.
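The style of expression-based search being benchmarked can be sketched with Python's standard re module, which covers most of the Perl-compatible syntax PCRE2 popularized; the log lines and pattern below are invented for illustration:

```python
import re

# Illustrative only: a Perl-compatible regular expression applied to
# raw, un-indexed log lines -- the workload Ryft says the FPGA
# instance accelerates at scale.
log_lines = [
    "2017-09-13 12:00:01 ERROR disk /dev/sda1 97% full",
    "2017-09-13 12:00:02 INFO heartbeat ok",
    "2017-09-13 12:00:03 ERROR disk /dev/sdb1 91% full",
]

# Named groups pull structure out of raw text with no prior indexing.
pattern = re.compile(r"ERROR disk (?P<dev>\S+) (?P<pct>\d+)% full")

hits = [(m.group("dev"), int(m.group("pct")))
        for line in log_lines if (m := pattern.search(line))]
print(hits)  # [('/dev/sda1', 97), ('/dev/sdb1', 91)]
```

The benchmark's point is throughput, not expressiveness: scanning a 1-Tb file line by line like this is what reportedly fell from over 62 hours on a commodity EC2 server to 0.69 hours on the FPGA-backed instance.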

Other cloud analytics features besides the AWS FPGA instance include a frontend running on the Ubuntu distribution of Linux along with Open API. The Ryft cloud also integrates with Apache Spark and database connectors.

The company added that it is readying software and algorithms to transform the leading public cloud into a “data analytics machine.” Ryft’s middleware layer provides the connectors and algorithms to help speed cloud analytics since, it asserts, cloud platforms alone “simply don’t have what it takes.”

AWS said last November its new F1 instance addresses the continuing data explosion generated by the Internet of Things, video streaming and other demanding workloads. Specifications for the F1 instances pair Intel’s Broadwell E5 2686 v4 processors with up to 976 Gbytes of memory and up to 4 Tb of storage with one to eight FPGAs.

Ryft CEO Des Wilson predicted at the time of the rollout that the “AWS marketplace will embrace heterogeneous computing principles with the advent of its new F1 instance, opening up a whole new set of data analytics capabilities.” Ryft specializes in hybrid FPGA/x86 configurations.

Recent items:

FPGA System Smokes Spark on Streaming Analytics

Neo4j Touts 10x Performance Boost of Graphs on IBM Power FPGAs

The post AWS Partner Ryft Leverages Cloud FPGAs appeared first on Datanami.

Read more here:: www.datanami.com/feed/

The post AWS Partner Ryft Leverages Cloud FPGAs appeared on IPv6.net.


Avenue4 Helps IPv4 Sellers and Buyers Gain Market Access, Overcome Complexities

With more than 30 years combined experience providing legal counsel to the technology industry, Avenue4’s principals have established unique expertise in the IPv4 industry, driving transparency and professionalism in the IPv4 market.

Co-founded by Marc Lindsey and Janine Goodman, Avenue4 possesses a deep understanding of the current market conditions surrounding IPv4 address trading and transfers. Through a broad network of contacts within Fortune 500 organizations, Avenue4 has gathered a significant inventory of IPv4 numbers. Leveraging this inventory and its reputation within the IT and telecom industries, Avenue4 is creating value for sellers and helping buyers make IPv6 adoption decisions that maximize return on their existing IPv4 infrastructure investments.

Understanding the IPv4 Market

Internet Protocol addresses, or IP addresses, are essential to the operation of the Internet. Every device needs an IP address in order to connect to the Internet, and then to communicate with other devices, computers, and services. IPv4 is Version 4 of the Internet Protocol in use today. The finite supply of IPv4 addresses, which had generally been available (for free) through Regional Internet Registries (RIRs) such as the American Registry for Internet Numbers (ARIN), is exhausted, and additional IPv4 addresses are only available in the North American, European and Asia Pacific regions through trading (or transfers) in the secondary market.

The next-generation Internet Protocol, IPv6, provides a near limitless free supply of IP addresses from the RIRs. However, IPv6 is not backward compatible with IPv4, which currently dominates the vast majority of Internet traffic. Migration to IPv6 can be costly — requiring significant upgrades to an organization’s IP network infrastructure (e.g., installing and configuring IPv6 routers, switches, firewalls, other security devices, and enhancing IP-enabled software, and then running both IPv4 and IPv6 networks concurrently). As a result, the global migration to IPv6 has progressed slowly — with many organizations planning their IPv6 deployments as long-term projects. Demand for IPv4 numbers will, therefore, likely continue to be strong for several more years.
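The scale gap behind this market dynamic is easy to check with Python's standard ipaddress module; the two sample addresses below are documentation-reserved examples, not real hosts:

```python
import ipaddress

# Why IPv4 exhaustion drives a secondary market while IPv6 supply is
# effectively limitless: the raw address-space sizes.
ipv4_total = 2 ** 32   # all possible IPv4 addresses
ipv6_total = 2 ** 128  # all possible IPv6 addresses
print(ipv4_total)                # 4294967296 -- about 4.3 billion
print(ipv6_total // ipv4_total)  # an entire IPv4 internet fits 2**96 times

# The two protocols are distinct address families, which is the
# backward-compatibility problem in a nutshell:
a4 = ipaddress.ip_address("192.0.2.1")
a6 = ipaddress.ip_address("2001:db8::1")
print(a4.version, a6.version)  # 4 6
```

Because an IPv4-only host cannot parse an IPv6 packet, operators must run both stacks during migration, which is the cost that keeps demand for transferable IPv4 blocks alive.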

Supplying Choice

Avenue4 specializes in connecting buyers and sellers of IPv4 addresses and provides access to a supply of IPv4 address space. The availability of this supply provides organizations with a viable choice to support their existing networks while the extended migration to IPv6 is underway. Although the supply of IPv4 address space has contracted relative to demand over the last 12 months, the IPv4 trading market provides network operators breathing room to develop and execute IPv6 deployment plans that are appropriate for their businesses.

Expertise Needed

Organizations in need of IPv4 addresses can purchase them from entities with unused addresses, and the transfer of control resulting from such sales can be effectively recorded in the regional Internet Registry (RIR) system pursuant to their market-based registration transfer policies. However, structuring and closing transactions can be complex, and the essential information necessary to make smart buy/sell decisions is not readily available. Succeeding in the market requires advisors with up-to-date knowledge about the nuances of the commercial, contractual, and Internet governance policies that shape IPv4 market transactions. With its deep experience, Avenue4 LLC cultivates transactions most likely to reach closure, structures creative and value-enhancing arrangements, and then takes those transactions to closure through the negotiation and registration transfer processes. By successfully navigating these challenges to broker, structure and negotiate some of the largest and most complex IPv4 transactions to date, Avenue4 has emerged as one of the industry’s most trusted IPv4 market advisors.

Avenue4 has focused on providing the counsel and guidance necessary to complete high-value transactions that meet the sellers’ market objectives, and provide buyers with flexibility and choice. When Avenue4 is engaged in deals, we believe that sellers and buyers should feel confident that the transactions we originate will be structured with market-leading terms, executed ethically, and closed in a way that protects the negotiated outcome.

Avenue4’s leadership team has advised some of the largest and most sophisticated holders of IPv4 number blocks. The principals of Avenue4, however, believe that technology-enabled services are the key to making the market more accessible to all participants. With the launch of its new online trading platform, ACCELR/8, Avenue4 is now bringing the same level of expertise and process maturity to the small and mid-size block market.

Read more here:: www.circleid.com/rss/topics/ipv6