All About IPv6

The Next Generation Analytics Database – Accelerated by GPUs

By Ana Vasquez

As organizations demand more and more from their analytics and data science teams, processing power looms as one of the fundamental roadblocks to success.

Organizations are facing a variety of processing-related data challenges: data centers sprawling to hundreds of nodes to provide processing power; data architects turning to convoluted pipelines to accommodate specialized tools; and business users frustrated by slow BI tools and latent results from batch queries.

A new generation of databases, accelerated by NVIDIA GPUs, is providing the way forward.

GPUs offer thousands of processing cores and are ideal for general-purpose parallelized computation—in addition to video processing! GPUs differ significantly from standard CPUs: today’s GPUs have around 4,500 cores (computational units) per device, compared to a CPU, which typically has 8 or 16 cores. GPUs are now exploding in popularity in areas such as self-driving cars, medical imaging, computational finance, and bioinformatics, to name a few.
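
As a rough illustration of what general-purpose GPU computation looks like from Python, the sketch below runs the same element-wise math on the CPU with NumPy and on the GPU with CuPy; CuPy and an NVIDIA GPU are assumed to be available and are not mentioned in the original article.

```python
# Minimal sketch of general-purpose GPU computation (assumes the CuPy
# library and an NVIDIA GPU with CUDA are installed -- an illustration,
# not part of the original article).
import numpy as np
import cupy as cp

n = 10_000_000

# CPU: NumPy evaluates this on a handful of cores.
x_cpu = np.random.rand(n).astype(np.float32)
y_cpu = np.sqrt(x_cpu) * 2.0 + 1.0

# GPU: the same element-wise math is spread across thousands of CUDA cores.
x_gpu = cp.asarray(x_cpu)           # copy the data into GPU memory
y_gpu = cp.sqrt(x_gpu) * 2.0 + 1.0  # each element handled in parallel
result = cp.asnumpy(y_gpu)          # copy the result back to the host

assert np.allclose(result, y_cpu, atol=1e-5)
```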

Analytics and data science tasks in particular benefit from parallelized compute on the GPU. Kinetica’s GPU-accelerated database vectorizes queries across many thousands of GPU cores and can produce results in a fraction of the time required by standard CPU-constrained databases. Analytics queries, such as SQL aggregations and GROUP BYs, are often reduced to seconds—down from several minutes with other analytics systems. Business users, working with tools such as Tableau or Power BI, see dashboards reload in an instant—no time to even think about getting coffee!
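
The shape of such a query is ordinary SQL. In the sketch below, sqlite3 and a tiny invented trades table merely stand in for a distributed GPU engine and a real dataset; only the GROUP BY pattern itself is the point.

```python
# Illustrative aggregation of the kind that vectorizes well on a GPU
# database; sqlite3 and this tiny 'trades' table are only stand-ins.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, price REAL, quantity INTEGER)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("AAPL", 150.1, 100), ("AAPL", 151.0, 200), ("MSFT", 250.5, 50)],
)

query = """
    SELECT symbol,
           COUNT(*)      AS trade_count,
           AVG(price)    AS avg_price,
           SUM(quantity) AS total_volume
    FROM trades
    GROUP BY symbol
    ORDER BY total_volume DESC
"""
for row in conn.execute(query):
    print(row)
```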

Solving the compute bottleneck also leads to substantially smaller hardware requirements. Organizations are replacing 300-node Spark clusters with just 30 nodes of a GPU-accelerated database. They’re running models and queries significantly faster than with any other analytics solution and also benefiting from huge savings on data center and data management overhead.

Many financial organizations have been innovating with GPUs for more than five years and were among the first businesses to realize their value. Some of these financial companies are deploying thousands of GPUs to run algorithms, including Monte Carlo simulations, on rapidly changing streaming trading data to compute risk, a calculation that is essential for regulatory compliance.
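
To make that workload concrete, here is a minimal Monte Carlo value-at-risk sketch in plain Python/NumPy; the portfolio value, return distribution, and confidence level are invented for the example, and a GPU-accelerated engine would simply run the same independent scenario math across thousands of cores.

```python
# Minimal Monte Carlo value-at-risk sketch (NumPy only; the portfolio
# value, return distribution, and confidence level are illustrative).
import numpy as np

rng = np.random.default_rng(seed=42)

portfolio_value = 1_000_000.0   # USD, hypothetical
mu, sigma = 0.0005, 0.02        # assumed daily return mean / volatility
n_scenarios = 1_000_000         # scenarios are independent, hence parallel-friendly

# Simulate one-day returns and the resulting profit/loss per scenario.
simulated_returns = rng.normal(mu, sigma, n_scenarios)
pnl = portfolio_value * simulated_returns

# 99% one-day value at risk: the loss exceeded in only 1% of scenarios.
var_99 = -np.percentile(pnl, 1)
print(f"99% 1-day VaR: ${var_99:,.0f}")
```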

A GPU-accelerated database that can natively perform custom computation on data distributed across multiple machines makes life easier for data science teams. Kinetica’s in-database analytics framework provides an API that makes it possible to run compute on the GPU using familiar languages such as Python. Custom data science workloads can now run alongside business analytics in a single solution and on a single copy of the data.
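
The general pattern behind such an API is a user-defined function that the database schedules against each distributed partition of a table, so the computation travels to the data rather than the other way around. The sketch below illustrates that pattern with plain NumPy; the partitioning, the z-score analytic, and the local map are stand-ins for what a distributed engine would orchestrate, not Kinetica's actual API.

```python
# Conceptual sketch of in-database, per-partition compute (plain NumPy;
# this stands in for a database scheduling the same function on each node).
import numpy as np

def score_partition(partition: np.ndarray) -> np.ndarray:
    """User-defined analytic that runs where the data lives."""
    mean = partition.mean()
    std = partition.std() or 1.0
    return np.abs(partition - mean) / std  # simple z-score per value

# Pretend these arrays are the shards of one table spread across nodes.
rng = np.random.default_rng(0)
partitions = [rng.normal(size=100_000) for _ in range(4)]

# A distributed engine would dispatch the UDF to each node (and its GPU)
# in parallel; here we just map it locally to show the data flow.
scores = [score_partition(p) for p in partitions]
print([round(float(s.mean()), 3) for s in scores])
```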

Enterprises can run sophisticated data science workloads in the same database that houses the rich information needed to run the business and drive day-to-day decisions. This neatly solves the data movement challenge because there isn’t any data movement, which leads to simpler Lambda architectures and more efficient machine learning and AI workloads. Quants, data scientists, and analysts can deploy a model from a deep learning framework via a simple API call, or train their models on the latest data; users get the benefits of the GPU and in-memory processing without needing to learn new programming languages.

With Kinetica, in-database compute can be extended to machine learning libraries such as TensorFlow, Caffe (a deep learning framework), and Torch (a machine learning and neural-network framework). These libraries can be extremely compute-hungry, especially on massive datasets, and benefit greatly from GPU horsepower and distributed compute capabilities.
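
For context, the training loop being moved next to the data can be as simple as the following minimal TensorFlow/Keras sketch; the synthetic rows stand in for data that would otherwise be read from (or scored inside) the database, and the model shape is arbitrary.

```python
# Minimal TensorFlow/Keras training sketch; synthetic data stands in for
# rows that would otherwise come from the database.
import numpy as np
import tensorflow as tf

# Fake "table": 10,000 rows, 20 numeric features, binary label.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 20)).astype("float32")
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# TensorFlow transparently uses an available NVIDIA GPU for these steps.
model.fit(X, y, epochs=3, batch_size=256, verbose=0)
print(model.evaluate(X, y, verbose=0))
```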

GPU-accelerated database solutions can be found in utilities and energy, healthcare, genomics research, automotive, retail, telecommunications, and many other industries. Adopters are combining traditional and transactional data, streaming data, and data from blogs, forums, and social media with orbital imagery and other IoT sources in a single, distributed, scalable solution.

Learn more: Advanced In-Database Analytics on the GPU

Download the eBook: How GPUs are Defining the Future of Data Analytics

Pick up your free copy of the new O’Reilly book “Introduction to GPUs for Data Analytics” from Kinetica booth #825 at Strata Data Conference in NY next week.

The post The Next Generation Analytics Database – Accelerated by GPUs appeared first on Datanami.

Read more here: www.datanami.com/feed/

Avenue4 Helps IPv4 Sellers and Buyers Gain Market Access, Overcome Complexities

With more than 30 years of combined experience providing legal counsel to the technology industry, Avenue4’s principals have established unique expertise in the IPv4 industry, driving transparency and professionalism in the IPv4 market.

Co-founded by Marc Lindsey and Janine Goodman, Avenue4 possesses a deep understanding of the current market conditions surrounding IPv4 address trading and transfers. Through a broad network of contacts within Fortune 500 organizations, Avenue4 has gathered a significant inventory of IPv4 numbers. Leveraging this inventory and its reputation within the IT and telecom industries, Avenue4 is creating value for sellers and helping buyers make IPv6 adoption decisions that maximize return on their existing IPv4 infrastructure investments.

Understanding the IPv4 Market

Internet Protocol addresses, or IP addresses, are essential to the operation of the Internet. Every device needs an IP address in order to connect to the Internet and communicate with other devices, computers, and services. IPv4 is version 4 of the Internet Protocol in use today. The finite supply of IPv4 addresses, which had generally been available free of charge from Regional Internet Registries (RIRs) such as the American Registry for Internet Numbers (ARIN), is now exhausted, and additional IPv4 addresses are available in the North American, European, and Asia-Pacific regions only through trading (or transfers) on the secondary market.

The next-generation Internet Protocol, IPv6, provides a near-limitless free supply of IP addresses from the RIRs. However, IPv6 is not backward compatible with IPv4, which still carries the vast majority of Internet traffic. Migration to IPv6 can be costly — requiring significant upgrades to an organization’s IP network infrastructure (e.g., installing and configuring IPv6 routers, switches, firewalls, and other security devices; enhancing IP-enabled software; and then running both IPv4 and IPv6 networks concurrently). As a result, the global migration to IPv6 has progressed slowly, with many organizations planning their IPv6 deployments as long-term projects. Demand for IPv4 numbers will therefore likely remain strong for several more years.
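
To put "near-limitless" in perspective, the two address spaces can be compared with nothing more than Python's standard-library ipaddress module:

```python
# Compare the total IPv4 and IPv6 address spaces (standard library only).
import ipaddress

ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses  # 2**32
ipv6_total = ipaddress.ip_network("::/0").num_addresses       # 2**128

print(f"IPv4 addresses: {ipv4_total:,}")                      # ~4.3 billion
print(f"IPv6 addresses: {ipv6_total:.3e}")                    # ~3.4e38
print(f"IPv6 / IPv4 ratio: {ipv6_total // ipv4_total:.3e}")
```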

Supplying Choice

Avenue4 specializes in connecting buyers and sellers of IPv4 addresses and provides access to a supply of IPv4 address space. The availability of this supply provides organizations with a viable choice to support their existing networks while the extended migration to IPv6 is underway. Although the supply of IPv4 address space has contracted relative to demand over the last 12 months, the IPv4 trading market provides network operators breathing room to develop and execute IPv6 deployment plans that are appropriate for their businesses.

Expertise Needed

Organizations in need of IPv4 addresses can purchase them from entities with unused addresses, and the transfer of control resulting from such sales can be effectively recorded in the Regional Internet Registry (RIR) system pursuant to the registries’ market-based registration transfer policies. However, structuring and closing transactions can be complex, and the information needed to make smart buy/sell decisions is not readily available. Succeeding in the market requires advisors with up-to-date knowledge of the nuances of the commercial, contractual, and Internet governance policies that shape IPv4 market transactions. With its deep experience, Avenue4 LLC cultivates the transactions most likely to reach closure, structures creative and value-enhancing arrangements, and then guides those transactions through the negotiation and registration transfer processes. By successfully navigating these challenges to broker, structure, and negotiate some of the largest and most complex IPv4 transactions to date, Avenue4 has emerged as one of the industry’s most trusted IPv4 market advisors.

Avenue4 has focused on providing the counsel and guidance necessary to complete high-value transactions that meet sellers’ market objectives and provide buyers with flexibility and choice. When Avenue4 is engaged in deals, we believe that sellers and buyers should feel confident that the transactions we originate will be structured with market-leading terms, executed ethically, and closed in a way that protects the negotiated outcome.

Avenue4’s leadership team has advised some of the largest and most sophisticated holders of IPv4 number blocks. The principals of Avenue4, however, believe that technology-enabled services are the key to making the market more accessible to all participants. With the launch of its new online trading platform, ACCELR/8, Avenue4 is now bringing the same level of expertise and process maturity to the small and mid-size block market.

Read more here: www.circleid.com/rss/topics/ipv6

ProLabs to debut new Green transceivers at ECOC 2017

By Sheetal Kumbhar

ProLabs, a provider of optical network infrastructure, will be at this year’s ECOC Exhibition in Gothenburg, Sweden, where it will showcase its full portfolio of connectivity solutions, including an innovative new product line. Launching at ECOC will be ProLabs’ line of new Green transceivers, which use new technologies resulting in 30% less power consumed than a […]

The post ProLabs to debut new Green transceivers at ECOC 2017 appeared first on IoT Now – How to run an IoT enabled business.

Read more here: www.m2mnow.biz/feed/

PTC to move its global headquarters to Boston Seaport

By Sheetal Kumbhar

PTC announced plans to move its global headquarters to 121 Seaport in Boston’s Seaport District. In so doing, PTC will bring more than 1,000 employees and a 30-year heritage of technology innovation to the site of the nation’s first-ever Innovation District. “There are few places in the world that can lay claim to as many ‘firsts’ […]

The post PTC to move its global headquarters to Boston Seaport appeared first on IoT Now – How to run an IoT enabled business.

Read more here: www.m2mnow.biz/feed/

IoT and data collection will drive the storage sector

By Gianluca Pisutu

Internet of Things (IoT) and storage

The Internet of Things (IoT) has been discussed for several years now, and for a while it was one of the IT industry’s buzzwords par excellence. Although the promising sector’s boom has not yet arrived, 451 Research reports that over the next twelve months companies involved in IoT projects will considerably increase their IoT budgets: the main spending items are storage (32.6% of responses), network edge equipment (30.2%), servers (29.4%), and SaaS cloud services (27.2%).

It is also interesting to note that the companies surveyed prefer to store data on infrastructure they own (53.1% of the sample) rather than rely on the public cloud. Once the raw data has been analyzed, that share rises to two thirds of the total.

Edge computing, another buzzword heavily emphasized at the Microsoft Build 2017 conference (“computing resources are moving to where the data is generated,” said Satya Nadella), still appears to be an uncommon practice: only 22.2% of the sample delegates analysis to the IoT devices themselves, and IT infrastructure located near the collection points is used in 23.3% of cases.

“While some enterprises say that in the future they will do more analytics – including heavy data processing and analysis driven by big data or AI – at the network edge, for now that deeper analysis is happening in company-owned data centers or in the public cloud,”

an analyst commented on the matter.

IoT: use cases and the job market

IoT use cases were divided into current and future ones. Among the former, 451 Research lists security monitoring, surveillance, and data center management. Over the next two years, IoT projects will also take on automation in operations-focused infrastructure and line-of-business-centric supply chain management.

A problem that also afflicts this part of the industry is the difficulty of finding qualified staff (according to half of the sample). The most sought-after skills are those related to data analysis, virtualization, and security.

Security in particular is a sensitive topic for those operating in the sector: the lack of clear, shared guidelines on the subject is proving counterproductive, feeding developers’ fears and playing into the hands of ever-present attackers. The record-setting attack suffered by Dyn was launched by tampering with the rudimentary protections used by Internet-connected surveillance cameras.

Source: 1

Read more here: www.hostingtalk.it/feed/
