The quality of irrigation water, and the correct management of water resources, is essential for crop productivity and efficiency. Analysing water before irrigating is crucial, and because quality can vary significantly with the time of year, frequent measurements are recommended.
The Spanish company GMV has developed a water quality monitoring system based on Libelium technology. The nodes were installed at the “El Portal” irrigation dam, located on the Guadalete river where it passes through Jerez de la Frontera (Spain).
Location of Jerez de la Frontera
GMV, founded in 1984, has wide experience in high-tech sectors and a growing order book across five continents. The company has transferred its technology across markets over the years and now focuses its efforts on two business lines: the transport and telecommunications sectors, and information technology applications.
The regional government had identified the high maintenance cost of the old measurement equipment, along with high transport costs and possible inconsistencies introduced by manual handling of the instruments.
“El Portal” irrigation dam at Jerez de la Frontera, Spain
The main goals of the project were to reduce the costs of measurement and data network management, and to avoid manual processing that may lead to inaccuracy. The electrical consumption of the previous equipment was another problem to solve, compounded by the fact that the location frequently suffers acts of vandalism against power lines, which would bring the monitoring system to a halt.
GMV and the regional government of Andalusia trusted Libelium technology to deploy this project to monitor different water quality parameters in an irrigation dam on the Guadalete river, close to Jerez de la Frontera.
Installation of the Waspmote Plug & Sense Smart Water sensors
Two Waspmote Plug & Sense! Smart Water nodes were installed at the site to measure temperature, pH, dissolved oxygen and conductivity every 30 minutes. GMV chose Sigfox as the communication protocol, with a view to enlarging the deployment in the future.
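A Sigfox uplink carries at most 12 bytes of payload, so a half-hourly reading like the one described above has to be packed tightly. A minimal sketch of how the four measurements could fit in a single frame; the scalings and field layout here are illustrative assumptions, not Libelium's actual wire format:

```python
import struct

# Pack one Smart Water reading into a compact Sigfox uplink frame.
# Illustrative scalings (not the real Libelium encoding):
#   temperature   0.01 degC resolution, signed -> int16
#   pH            0.001 pH resolution          -> uint16
#   dissolved O2  0.01 mg/L resolution         -> uint16
#   conductivity  1 uS/cm resolution           -> uint16
FRAME = ">hHHH"  # 8 bytes, comfortably inside Sigfox's 12-byte limit

def pack_reading(temp_c, ph, do_mg_l, cond_us_cm):
    return struct.pack(FRAME, round(temp_c * 100), round(ph * 1000),
                       round(do_mg_l * 100), round(cond_us_cm))

def unpack_reading(frame):
    t, p, d, c = struct.unpack(FRAME, frame)
    return t / 100, p / 1000, d / 100, float(c)

frame = pack_reading(21.37, 7.842, 8.55, 1250)
assert len(frame) <= 12          # fits a single Sigfox uplink
print(unpack_reading(frame))     # (21.37, 7.842, 8.55, 1250.0)
```

Packing at this density is one reason Sigfox suits infrequent, small telemetry such as a 30-minute water quality cadence.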
Waspmote Plug & Sense! Smart Water at “El Portal” dam
The data collected by the sensors is sent to the proprietary software SEMS (Smart Environment Monitor System), which allows monitoring of any kind of parameter, managing sensors, executing custom queries, managing users, reporting alarms and many other operations.
Diagram of GMV project
This platform gives the irrigators access to real-time information on water quality to help decision-making in aspects such as the opening and closing of gates or the hours when water quality is highest. Additionally, manual collection is no longer necessary, so access to the information is now easier and quicker.
GMV highlights the adaptability of the Waspmote wireless sensor platform to any need and any environment, along with its interoperability and compatibility with Sigfox and its low power consumption, which were ideal for the challenge the company faced.
GMV SEMS dashboard for the Andalusian Government
This new water quality monitoring system meant savings of around 50% in development time. The company is currently preparing a technical report presenting the results obtained from the deployment in terms of sensorisation cost savings.
The Andalusian government (Junta de Andalucía in […]
Read more here:: www.m2mnow.biz/feed/
Syncsort, a global provider of Big Iron to Big Data solutions, announced the results of its fourth annual Big Data survey. The results reveal the top use cases and challenges organisations face as they advance their modern data architecture and data lake initiatives, and show significant benefits from Hadoop/Spark, with nearly 60% citing both increased productivity across the organisation and improved efficiency to reduce costs as their biggest gains.
Compared with last year’s survey, the most dramatic increase in reported benefits was higher revenue and accelerated growth, which 55% named as a benefit this year compared with 37% last year. And though organisations are using Big Data insights in more sophisticated ways to improve revenue and customer service, they continue to face some of the same challenges reported in past years, including keeping up with rapidly changing technologies and tools.
The survey also found that Hadoop and Spark, which had high interest but low adoption at the time Syncsort launched the survey in 2014, are now in production or test at 70% of responding organisations. Specifically, this year, more than 40% of respondents say they are in production with either Hadoop or Spark, and 30% say they are engaged in a proof of concept or pilot program.
Based on the survey results, the five key trends in Data Lake initiatives that organisations need to monitor in 2018 include:
1. The composition of the data lake shifts. Traditional sources remain most popular for filling the data lake. Relational Database Management Systems (RDBMS) were chosen as the top source at 69%, up from 61% last year, surpassing enterprise data warehouses (EDW) at 63%. But newer sources grew, with NoSQL databases identified by 46% of respondents compared to 35% last year. Cloud repositories are also gaining strength as a data source (accounting for 40%) as more organisations leverage the cloud as a deployment platform.
2. Legacy platforms continue to play a significant role. Data from legacy platforms (such as the mainframe and IBM i) also makes a major contribution to the data lake: over 97% of respondents with mainframes believe it's valuable to access and integrate mainframe data into the data lake for real-time analytics, a 27% increase over last year. Over 90% of those with IBM i believe it's valuable to access and integrate that data into the data lake; not surprising, as leaving behind decades of valuable data stored on these systems would seriously hamper their companies' analytics.
3. Data quality and regulatory compliance challenges are top of mind. While the skills shortage had been ranked the top challenge for three consecutive years, this year it fell to number two, replaced by mounting concerns over improving the quality of data in the data lake. Indeed, 40% said data quality was a significant struggle for their organisation, likely a result of expanded use of data lakes driving an emphasis on improving data quality. But the survey also showed not everyone is making the connection between data quality and ROI. 60% of Financial Services and Insurance professionals said ensuring data quality is a top […]
Read more here:: www.m2mnow.biz/feed/
The level of industry confidence in Wi-Fi investment is at its highest-ever, according to the Wireless Broadband Alliance’s Annual Industry Report for 2017. As the wireless industry becomes crucial to delivering high quality, high speed, low latency connectivity, the new global study has revealed that over 80% of those surveyed feel as or more confident than they did a year ago. And when looking at unlicensed spectrum more broadly, almost half (47%) feel more confident.
The report, compiled by Maravedis on behalf of the WBA, comes at a significant time for the wireless ecosystem. There is a growing consensus that the success of 5G, unlike previous generations of standards, will rely on the convergence of multiple Radio Access Technologies (RATs) in unlicensed, shared and licensed spectrum, with Wi-Fi playing the central role.
Developments in the latest Wi-Fi standards, including 802.11ax, will improve Wi-Fi performance and capabilities to support 5G use cases – including high density networks, extreme Mobile Broadband (eMBB) and aggregation of multiple frequencies, amongst others.
“Wireless use cases are expanding rapidly, enabled by new technologies and spectrum in the unlicensed and shared bands”, said Adlane Fellah, senior analyst, Maravedis. “These innovations are laying the foundations for the 5G era, in which Wi-Fi will play a central role.”
As industry attention moves toward monetising Wi-Fi, the study also highlights the drivers of additional traffic over the network, as well as use cases with initial revenue potential in different verticals. In this year’s survey, the services most important to monetisation strategies for 2018 according to respondents included location based services (37.5%), roaming (33%) and marketing analytics (almost 33%).
The three Wi-Fi use cases tipped to drive near term revenue potential include: extending internet access and media to a full smart home, richer and more efficient enterprise services driven by cloud managed networks and security, and expansion of the Wi-Fi roaming model.
The report also highlights the power of Wi-Fi, along with Low Power Wide Area Network (LPWAN) technologies, to provide a rapid and cost effective deployment of various Internet of Things (IoT) applications, including the deployment of smart cities. But interoperability between different technologies, independent certification of devices and equipment and collaboration between different city stakeholders were identified as areas that connected city ecosystems must urgently address.
The WBA’s Connected City Advisory Board produced the second version of its Blueprint in November 2017, which intends to help cities develop their plans to become smart and emphasises the need to bring together the complex value chain of city stakeholders.
Also uncovered in the report is the rising adoption of the WBA's Next Generation Hotspot (NGH) to support seamless authentication and multi-RAT access. The survey found that NGH had crossed the chasm, with 23% of respondents confirming its implementation and 30% planning to do so by the end of the year.
“As Wi-Fi continues to evolve, enabled by new technologies, it has the ability to support new connected services and use cases in the 5G era across all segments including, Carriers & Service Providers, Connected Cities and Enterprise & Hospitality ecosystems”, said Shrikant […]
The post Industry confidence in investing in Wi-Fi reaches record levels appeared first on IoT Now – How to run an IoT enabled business.
Read more here:: www.m2mnow.biz/feed/
Powercast Corporation, the provider of radio-frequency (RF)-based long-range power-over-distance wireless charging technology, announced that it will unveil at CES its FCC-approved (Part 15, FCC ID: YESTX91503) and ISED-approved (Canada IC: 8985A-TX91503) three-watt PowerSpot transmitter, which works in the far field (up to 80 feet) for over-the-air charging of multiple devices – no charging mats or direct line of sight needed.
Powercast used the experience it gained powering industrial and commercial devices with its initial Powercaster® transmitter (FCC and ISED approved in 2010) to develop the new smaller, smarter and less expensive PowerSpot transmitter specifically for the consumer market. The PowerSpot is the industry’s first long-range, far-field, power-over-distance wireless recharging transmitter for consumer devices to gain FCC and ISED approval.
How Powercast’s patented remote wireless charging technology works
Creating a coverage area like Wi-Fi, a Powercast transmitter automatically charges enabled devices when within range. The transmitter uses the 915-MHz ISM band to send RF energy to a tiny Powercast receiver chip embedded in a device, which converts it to direct current (DC) to directly power or recharge that device’s batteries.
Powercast will begin production of its standalone PowerSpot charger now that it is FCC approved and is also offering a PowerSpot subassembly that consumer goods manufacturers can integrate into their own products.
Consider lamps, appliances, set-top boxes, gaming systems, computer monitors, furniture or vehicle dashboards that become “PowerSpots” able to charge multiple enabled devices around them.
Powercast is in discussions with several manufacturers, and has inked deals with two household names, since releasing a wireless power development kit in early 2017 containing the PowerSpot subassembly.
“Consumer electronics manufacturers can now confidently build our FCC-approved technology into their wireless charging ecosystems, and offer their customers convenient far-field charging where devices charge over the air from a power source without needing direct contact, like inductive charging requires, or near direct contact, like magnetic resonance requires,” said Powercast’s COO/CTO Charles Greene, Ph.D.
The company’s vision is to enable long-range, true wireless charging where consumers simply place all Powercast-enabled devices for charging within range of a PowerSpot in their home or a public place.
“Others might be talking RF power possibilities, but we have consistently delivered far-field wireless power solutions that work, safely and responsibly, under FCC and other global standards providing power up to 80 feet,” said Greene. “Our robust technology has capabilities beyond today’s permitted standards, so our product releases will evolve as regulations do.”
The PowerSpot creates an overnight charging zone of up to 80 feet free of wires or charging mats
Enabled devices charge when in range, but don’t need direct line of sight to the PowerSpot. Powercast expects up to 30 devices left in the zone on a countertop or desktop overnight can charge by morning, sharing the transmitter’s three-watt (EIRP) power output. Charging rates will vary with distance, type and power consumption of a device.
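Why closer devices charge faster follows directly from how RF power falls off with distance. A back-of-envelope free-space estimate (the Friis equation) at the PowerSpot's 915 MHz and 3 W EIRP; the receive antenna gain of 1 (0 dBi) is an assumption, and real indoor propagation plus rectifier losses will lower these figures further:

```python
from math import pi

C = 3e8        # speed of light, m/s
FREQ = 915e6   # 915-MHz ISM band used by the PowerSpot
EIRP_W = 3.0   # transmitter EIRP, W

def received_power_w(distance_m, rx_gain=1.0):
    """Free-space received power via the Friis equation."""
    wavelength = C / FREQ
    return EIRP_W * rx_gain * (wavelength / (4 * pi * distance_m)) ** 2

for feet in (2, 6, 80):
    d = feet * 0.3048  # feet -> metres
    print(f"{feet:>3} ft: {received_power_w(d) * 1e6:,.1f} microwatts")
```

The inverse-square falloff (roughly milliwatts at two feet, microwatts at eighty) is why power-hungry devices charge best close to the transmitter while the edge of the zone suits slow trickle charging.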
TX91503 – PowerSpot Transmitter
Power-hungry, heavily used devices like game controllers, smart watches, fitness bands, hearing aids, ear buds, or headphones charge best up to two feet away; keyboards and mice up to six feet away; TV […]
Read more here:: www.m2mnow.biz/feed/
Research from Databarracks has revealed that 30% of organisations do not know how much of their IT budget is spent on disaster recovery and backup services. This follows wider industry research finding that firms in Europe and North America spend 7% of their IT budget on backup and disaster recovery.
Data from Databarracks’ annual Data Health Check survey revealed a number of insights into organisational attitudes and approaches towards IT resiliency:
25% do not know what percentage of their IT budget should be allocated for disaster recovery and business continuity
Only 43% of organisations have tested their disaster recovery processes over the past 12 months
29% of respondents answered “less than £1,000” when asked ‘how much annually does your organisation spend on backup/DR solutions’
Peter Groucutt, managing director of Databarracks comments: “It’s often difficult for IT to secure investment for resiliency because it’s not seen as a particularly dynamic or sexy investment that will add value like a new customer-facing system. But we all know we need to invest in resilience to ensure our continued operation.”
Groucutt continued: “Disaster recovery and backup spending is protection against the risks of user downtime, data loss, and business interruption, but knowing how much investment is needed is often difficult to gauge. Every organisation knows it needs some level of protection, but determining the extent, and the appropriate financial investment, is an ongoing challenge, as our research highlights.
“The analogy we often use is the police recommendation for protecting a bike. They suggest spending at least 10% of the value of the bike on the lock to secure it, which makes sense – if you put a cheap lock on an expensive bike it will be quickly stolen. The one caveat we would add to that analogy is that the amount you spend should also relate to your risk profile.
“If you frequently lock your bike up at a train station with known bike thieves you would be wise to invest more in your lock. Similarly, if your premises are in a location susceptible to flooding or terrorist events it is sensible to invest more in your resilience.”
Groucutt concludes that to help secure the funds needed to improve IT resilience, senior management must understand what the true cost of IT downtime would mean for an organisation: “There isn't a simple answer of ‘invest X% and you'll remain safe’ that works for all businesses, but that doesn't mean budgeting for continuity is impossible.
“As with other aspects of continuity planning, if you have identified the risks to your business and analysed the impact incidents will have on your operations, it then becomes clear what mitigation strategies you need to put in place. It will always be difficult to secure investment for IT resilience if you don’t have a clear picture of what impact downtime will have.
Presenting a downtime cost – considering both the tangible, as well as the intangible or hidden costs – immediately puts the cost of investment into context, and helps strengthen the case for improving IT resilience.”
For more information […]
Read more here:: www.m2mnow.biz/feed/
It’s been more than a decade since we first heard the phrase “data is the new oil.” But while this idea may well define the next generation of business, there’s important context surrounding it that often gets overlooked.
Since making this statement, many others have echoed these same words. What we now have come to realise is that data is a commodity, a raw material, and it’s only valuable when it can be turned into intelligence. Without the right tools to refine it, it’s just a bunch of ones and zeros. Study after study shows that while most enterprises understand the importance of data, they continue to struggle to draw real value from it, says Matt Mills, CEO and Board Member at MapR Technologies.
Amongst the largest and most powerful tech titans in the world, the idea of data turned into intelligence is gospel. It's why companies like Apple, Amazon, Google, Facebook, and Microsoft dominate the charts these days. They not only know the value of data, but also how to transform it from raw material into a competitive advantage.
Unlike these companies, many organisations today are using 30-year-old technologies to “refine” their data and are frustrated with the little progress they are making. The simple fact is that older technologies are often too fragile and simply aren't built for the diversity or the sheer volume of data today. And this is just the beginning: many believe that from now on, data and the digital universe will double in size every two years.
Successful data-driven companies in the early stages of their digital transformation journeys are choosing a modern data platform that is both optimised for performance today and provides the speed, scale and reliability that are required for next-generation intelligent solutions.
The modern data platform has 10 key characteristics:
A single platform that performs analytics and applications at once
Manages all data from big to small, structured and unstructured, tables, streams, or files – all data types from any source
A database that runs rich data-intensive applications and in-place analytics
Global cloud data fabric that brings together all data from across every cloud to ingest, store, manage, process, apply, and analyse data as it happens
Diverse compute engines to take advantage of analytics, machine learning and artificial intelligence
Delivers cloud economics by operating on any and every cloud of your choice, public or private
No lock-in, supporting open APIs
DataOps ready to champion the new process to create and deploy new, intelligent modern applications, products and services
Trusted with security built from the ground up
Streaming and edge first for all data-in-motion from any data source as data happens and enabling microservices natively
Many software products today can handle some aspects of modern day data platforms, but few if any can actually deliver on all of these requirements. This is where most companies get into trouble. They try to extend the software to do things it was never intended to do. These limitations are a key reason why many companies never reach their goals and objectives with data.
Here at MapR we have built the premier data platform for today — and tomorrow’s — leading enterprises. […]
Read more here:: www.m2mnow.biz/feed/
This is the eighth year we have measured IPv6 on the Christmas Goat. And what a crazy climate we live in now: there is no snow on the goat or on the ground… (If you want to remember the crazy snow storm of 1998, watch this.) But IPv6 is doing better than the climate this year: the share increased from 27% in 2016 to 40% in 2017. In Sweden, Tele2, Tre and Comhem are still the only major ISPs with IPv6 enabled. Tele2 (with IPv6 for about three years) and Tre are mostly mobile operators, and Comhem has enabled IPv6 in its DOCSIS network.
But these three only account for about 10% by themselves; the other 30% comes mostly from outside Sweden.
Values from previous measurements:
You can see what the line is approaching: next year we will easily break the 50% barrier! 🙂
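The 50% claim follows from a simple straight-line extrapolation of the two most recent data points quoted in the text (the earlier years' values are omitted here):

```python
# Linear extrapolation of the goat's IPv6 share from the two most
# recent measurements quoted above, in percent of requests over IPv6.
share = {2016: 27.0, 2017: 40.0}

growth = share[2017] - share[2016]   # 13 percentage points per year
forecast_2018 = share[2017] + growth

print(forecast_2018)                 # 53.0 -> comfortably past 50%
```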
In Sweden, IPv6 traffic increased from 4% to 6% this year according to Google's measurements. That's a great increase, but we still have much to do. I have written some articles (in Swedish, with a Google Translate link at the bottom) about our own problems; the first of them is here (Why Internet in Sweden is broken).
And Google's worldwide measurements show an increase from ~15% in 2016 to ~23% in 2017.
Have a happy and good IPv6 year in 2018, and I hope I can do the measurement in 2018 too!
Written by Torbjörn Eklöv, CTO, Senior Network Architect, DNSSEC/IPv6
Read more here:: www.circleid.com/rss/topics/ipv6
The Internet of Things is no longer the future. It's very much the present, and the market is growing at an exponential rate, says Robin Kent, director of European operations, Adax. It's set to more than quadruple in size, growing from US$900 billion (€758.25 billion) in 2014 to $4.3 trillion (€3.62 trillion) by 2024, with more than 30 billion connected devices in use.
Approximately 7.5 billion of those devices will access a cellular IoT core network, three times the number of LTE-connected smartphones today. Service providers have faced waves of game-changing technologies driven by end-user demand over the past 20 years, including the rise of SMS and the explosion of Over-The-Top (OTT) applications on smartphones, and IoT is in the same bracket, if not in a bracket of its own, as the future of the technology world.
If media and analyst predictions on its rate of growth are proved true, the telecoms industry faces huge pressure. It will play the pivotal role in anchoring connections from device to device and the core. But if connections are lost, what are the consequences? And how do service providers play their part in ensuring they’re not faced with dealing with the impact of potential downtime?
Driving the business model development
With predictions that total M2M revenue opportunity is forecast to reach $1.6 trillion (€1.35 trillion) in 2024, up from $500 billion (€421.25 billion) in 2014 (an annual growth of 12%), it’s clear why so many industries are vying for their piece of the pie. Telecoms has been identified as the glue which will hold the connected world together so it’s vital that any teething problems in these early years are identified and resolved.
Just last month, we saw Vodafone delve into the consumer side of IoT with the launch of its new “V by Vodafone” bundle, whereby consumers are charged for the number of connected devices they add to their monthly plan. Consumers are one of many drivers behind the rise of IoT and it won’t be long before other operators follow in Vodafone’s footsteps.
Latency is not an option
For the true benefits of IoT to be recognised, highly reliable connectivity is key. Much in the same way networks deal with voice calls, device connection must be immediate and on hand whenever called upon, as well as being reliable and strong to avoid latency. Devices vary depending on their use case, throughput requirements, power consumption and the service requirements across different IoT applications. These applications can be categorised by two factors: data throughput and connectivity.
Connectivity is key for the wide range of applications that have a low tolerance of latency. These include: location-based marketing and advertising; industrial robotics and environmental control; smart home control; augmented and virtual reality; vehicle control and telematics; and remote personal healthcare, remote surgery and some wearables. It's vitally important that operators have the correct tools in place to address latency control, reliability improvement and authentication for such applications.
Keeping a close connection
It’s paramount for operators to have an understanding of how to build a […]
Read more here:: www.m2mnow.biz/feed/
The justice system is known for many things, but efficiency is not one of them. Neither is being up-to-speed with technology. One joke goes that the unofficial IT slogan of the courts is, “Yesterday’s technology, tomorrow!”
Into this space comes LegalThings, an Amsterdam-based digital contracts company that’s aiming to update how those accused of a crime move through the justice system by making the law accessible while making judicial record-keeping more open and secure.
After winning a “blockathon” competition in September hosted by the Dutch Ministry of Justice and Security, LegalThings began a pilot project with the Public Prosecution Service of the Netherlands, known as the “Openbaar Ministerie,” or OM, in Dutch. The project aims to build a system to process low-level criminal offenders quickly and with more transparency. If successful, it could be a huge time- and money-saving enterprise for the government.
“What you see now [in the justice system] is there is a lot of procedures, and those procedures are important to create a fair legal system, but they’re also really labor-intensive,” said Arnold Daniels, a co-founder of LegalThings and its chief software engineer. “What we’re trying to do is create an alternative to that.”
How might that work in practice? Imagine someone nabbed for possession of a small amount of illicit drugs, a crime that, in the Netherlands, can carry a fine of a few hundred euros. There are a number of parties involved in processing such a law enforcement action: the police who catch the alleged offender, the forensics expert that examines the drugs, and the OM.
Depending on whether the forensic expert is on-site to test the drugs, processing such an enforcement action can take anywhere from several hours to a couple days, said Sanne Giphart, innovation manager at OM. While some record-keeping systems have been made digital, that’s an ongoing process, Giphart explained. Things can move slowly.
By contrast, with the LegalThings application, the accused can get an explanation of the relevant law, choose whether to be represented by counsel, and agree to pay the relevant fine—all on their smartphone. All told, the actual processing of the offender takes about 30 minutes, and every step of the exchange is recorded, time-stamped, and made unchangeable using cryptography to ensure records can’t be fudged.
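The tamper-evidence described above can be sketched as a simple hash chain, where each record commits to its predecessor's hash, so any later edit breaks verification. The field names and structure here are hypothetical illustrations, not LegalThings' actual design:

```python
import hashlib
import json
import time

def append_record(chain, payload):
    """Append a time-stamped record that commits to the previous hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"payload": payload, "timestamp": time.time(), "prev": prev_hash}
    serialized = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(serialized).hexdigest()
    chain.append(record)
    return record

def verify(chain):
    """Recompute every hash; any edited or reordered record fails."""
    prev = "0" * 64
    for rec in chain:
        body = {k: v for k, v in rec.items() if k != "hash"}
        serialized = json.dumps(body, sort_keys=True).encode()
        if rec["prev"] != prev:
            return False
        if hashlib.sha256(serialized).hexdigest() != rec["hash"]:
            return False
        prev = rec["hash"]
    return True

chain = []
append_record(chain, {"step": "offence recorded", "fine_eur": 300})
append_record(chain, {"step": "counsel waived"})
assert verify(chain)

chain[0]["payload"]["fine_eur"] = 0   # "no backsies": any edit is detected
assert not verify(chain)
```

A production system would add digital signatures and distribute copies among the parties, but the core guarantee, that the initial action always remains visible, comes from this chaining.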
So far, OM, which is comparable to a mashup of the Department of Justice and local district attorneys in the U.S., has only experimented with the technology on “dummy data” involving a drug offense and a domestic violence offense, Giphart said. “The next step is to let people get familiar with this type of technology within the [OM] and then hopefully we can implement on one stream of cases.”
The challenges to implementing such a system are not purely technological. It also will likely require some changes in both public and institutional attitudes toward judicial record-keeping, said Daniels. “With this system, there’s really no backsies,” he explained. “You can correct it, but you can always see your initial action.”
Unlike other blockchain systems that use a publicly distributed ledger, the LegalThings project with OM allows […]
Read more here:: www.m2mnow.biz/feed/
UK drone technology and service provider uVue has taken delivery of what is believed to be the world’s first ever production hydrogen drone – the MMC Hydrone 1550 – enabling a new era for drone technology services in the UK to begin.
According to uVue, the MMC Hydrone 1550 is a huge breakthrough for the drone services industry because it is now a viable alternative for many service providers using manned helicopters and light aircraft. As a comparison, current battery-powered drones average a 30-minute flight time, with the voltage constantly dropping during flight, compared with the MMC Hydrone 1550's equivalent of three hours with a consistent voltage throughout.
The ultimate cost savings for many industries choosing an MMC Hydrone 1550 solution are therefore transformational. Examples of industries that will benefit from this ground-breaking drone include precision agriculture, security and surveillance, traffic monitoring, emergency services and construction, to name a few.
Russ Delaney, director of Tech Ops at uVue, an ex-British Army helicopter instructor and drone pilot with more than 20 years’ experience of unmanned aerial vehicles (UAVs) comments, “After extensive research into drone technology and potential solutions, we are delighted to be the first company to have been appointed to distribute and licence this incredible machine in the UK.
Aside from its robust performance, I am particularly impressed by the endurance of the MMC Hydrone 1550 – up to three hours' flight time with a 2kg payload – making this a genuine game changer in our industry. The endurance of this particular Hydrone is so strong that it offers a real alternative to full-sized rotary aircraft.”
The key focus for uVue during the introduction of the Hydrone 1550 was that all of the onboard systems met the very high safety standards required to operate the drone safely within the UK commercial drone airspace, whilst also remaining within the regulations set by the Civil Aviation Authority (CAA).
uVue is the exclusive licensed distributor for all MMC Drones and the MMC Hydrone 1550 is available to be bought or hired as a service proposition from uVue here.
Click here to see a video of the drone in flight.
Comment on this article below or via Twitter: @IoTNow_OR @jcIoTnow
Read more here:: www.m2mnow.biz/feed/