LOS ANGELES – 15 February 2019 – The Internet Corporation for Assigned Names and Numbers (ICANN) today announced that it is aware of several recent public reports regarding malicious activity targeting the Domain Name System (DNS). We have no indication that any ICANN organization systems have been compromised, and we are working with relevant community members to investigate reports of attacks against top-level domains (TLDs). For some reporting on this issue, please refer to these sources:
- United States Department of Homeland Security (DHS) and Cybersecurity and Infrastructure Security Agency (CISA) Emergency Directive 19-01: “Mitigate DNS Internet Tampering”, 22 January 2019.
- “Why CISA Issued our first Emergency Directive”, United States DHS CISA blog, 24 January 2019.
- “Global DNS Hijacking Campaign: DNS Record Manipulation at Scale”, FireEye, 9 January 2019.
- “Widespread DNS Hijacking Activity Targets Multiple Sectors”, Crowdstrike blog, 25 January 2019.
- “Statement on man-in-the-middle attack against Netnod”, Netnod statement, 5 February 2019.
- “Revisiting How Registrants Can Reduce the Threat of Domain Hijacking”, Verisign blog, 11 February 2019.
- “.nl not affected by global domain hijacking campaign”, Stichting Internet Domeinregistratie Nederland blog, 15 February 2019.
ICANN believes it is essential that members of the domain name industry (registries, registrars, resellers, and related others) take immediate, proactive, and precautionary measures, including implementing security best practices, to protect their systems, their customers’ systems, and information reachable via the DNS.
We trust that DNS industry actors are already taking strong security precautions in their businesses. However, here is a checklist to consider.
- Ensure all system security patches have been reviewed and have been applied;
- Review log files for unauthorized access to systems, especially administrator access;
- Review internal controls over administrator (“root”) access;
- Verify integrity of every DNS record, and the change history of those records;
- Enforce sufficient password complexity, especially length of password;
- Ensure that passwords are not shared with other users;
- Ensure that passwords are never stored or transmitted in clear text;
- Enforce regular and periodic password changes;
- Enforce a password lockout policy;
- Ensure that DNS zone records are DNSSEC signed and your DNS resolvers are performing DNSSEC validation;
- Ideally ensure multi-factor authentication is enabled on all systems, especially for administrator access; and
- Ideally ensure your email domain has a DMARC policy with SPF and/or DKIM and that you enforce such policies provided by other domains on your email system.
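The password-related items in the checklist above can be sketched as a small policy check. This is a minimal illustration in Python; the length, complexity, and lockout thresholds are assumptions for the example, not ICANN guidance:

```python
import re

# Illustrative policy values only; real thresholds should follow
# your organization's security baseline.
MIN_LENGTH = 14
MAX_FAILED_ATTEMPTS = 5

def password_meets_policy(password: str) -> bool:
    """Return True if the password satisfies length and complexity rules."""
    if len(password) < MIN_LENGTH:
        return False
    # Require a mix of character classes in addition to sufficient length.
    required_classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^a-zA-Z0-9]"]
    return all(re.search(pattern, password) for pattern in required_classes)

def should_lock_out(failed_attempts: int) -> bool:
    """Lockout policy: block further attempts after repeated failures."""
    return failed_attempts >= MAX_FAILED_ATTEMPTS
```

In practice these checks would sit behind the authentication system rather than in application code, but the principle is the same: enforce length first, then character diversity, then rate-limit failures.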
The Security and Stability Advisory Committee (SSAC) has previously published advice and information on security best practices relevant to this threat.
ICANN strives to be a trusted partner in the multistakeholder community and to engage in collaborative efforts to ensure the security, stability, and resiliency of the Internet’s global identifier systems. For more information on ICANN’s role in the security, stability, and resiliency of the Internet’s identifier systems, visit https://www.icann.org/octo-ssr.
The ICANN community will continue the discussion on this critical topic at its upcoming ICANN64 meeting in Kobe. In addition, ICANN org is available to provide consultation on security best practices by emailing email@example.com.
ICANN’s mission is to help ensure a stable, secure, and unified global Internet. To reach another person on the Internet, you need to type an address – a name or a number – into your computer or other device. That address must be unique so computers know where to find each other. ICANN helps coordinate and support these unique identifiers across the world. ICANN was formed in 1998 as a not-for-profit public-benefit corporation with a community of participants from all over the world.
Read more here: www.icann.org/news.rss
The SWOT Guide To Blockchain Part 6
The SWOT guide to blockchain is a guide in six parts, where both the opportunities and challenges of blockchain are considered. Blockchain has the potential to be groundbreaking, offering opportunities and better solutions for a range of situations and industries worldwide. In this sixth part of the guide, we analyze what is, in our opinion, the most important weakness of blockchain technology: its consumption of energy. We also demonstrate how such a weakness can be transformed into a strength if the blockchain community is spurred to innovate towards sustainable and ecological alternatives for blockchain solutions.
By Maria Fonseca and Paula Newton
Bitcoin And The Environment
Increasingly, concerns about blockchain’s impact on the environment are rising to the fore. In 2018 it was reported that bitcoin produces the same amount of carbon dioxide in a year as one million flights crossing the Atlantic. This cannot be ignored. The amount of electricity used to drive bitcoin is tremendous. Further statistics have emerged that accentuate the point. For example, it was argued that in one month alone, the electricity used by the bitcoin network was greater than that used by the whole of the Republic of Ireland. Since that time (November 2017), bitcoin’s electricity use has only grown further. For anyone with even a passing interest in the environment, this is a concern.
The whole bitcoin system is built around the use of electricity. Mining, which consumes electricity, must occur for the system to operate. The faster and more effectively miners can mine, using more electricity through more powerful machines, the higher the chances that a miner will claim the biggest reward. Everyone is therefore motivated to use more electricity to gain the highest rewards. The bigger the system gets, the more electricity is burned to support it. Estimates show that if the price of bitcoin rose to $50,000, electricity consumption would increase tenfold, which is clearly tremendous. Some believe this is not a major concern, since bitcoin will not achieve such a value (though this is arguable) and since technology will likely be developed that allows more energy-efficient mining. Indeed, mining computers have already increased in efficiency over time. Yet it is impossible to rule out ongoing and continual increases in demand for energy, making blockchain an environmental concern for many.
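The scaling argument above can be made concrete with a back-of-envelope calculation. The network hashrate and hardware efficiency figures below are illustrative assumptions, not measurements, but they show how energy use scales linearly with total mining effort:

```python
# Back-of-envelope estimate of annual mining energy use.
# Both figures below are illustrative assumptions, not measured values.
ASSUMED_NETWORK_HASHRATE = 45e18   # hashes per second (assumed)
ASSUMED_EFFICIENCY_J_PER_TH = 50.0 # joules per terahash of hardware (assumed)

def annual_energy_twh(hashrate_hps: float, joules_per_terahash: float) -> float:
    """Estimate yearly network energy consumption in terawatt-hours."""
    terahashes_per_second = hashrate_hps / 1e12
    power_watts = terahashes_per_second * joules_per_terahash
    joules_per_year = power_watts * 365 * 24 * 3600
    return joules_per_year / 3.6e15  # 1 TWh = 3.6e15 joules
```

With these assumed inputs the estimate lands around 20 TWh per year; the point of the sketch is that doubling the hashrate at fixed hardware efficiency doubles the energy bill, which is exactly the incentive problem described above.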
But it isn’t all bad news so far as the environment is concerned. Other industry analysts suggest that green cryptocurrencies could have a part to play. While to date buying environmentally friendly items has generally meant paying more for them, changes are afoot with the rise of green cryptocurrencies. It is thought that such cryptocurrencies will offer benefits to people making greener purchases, as well as driving innovation. Green cryptocurrencies work because blockchain has the ability to track and monitor environmental performance by businesses or individuals. This record can be saved and embedded into the system, and those that are more effective in this regard can be rewarded. At the same time, consumers will be able to have increased confidence that green really means green.
2019: And Still Waiting for Green Cryptocurrencies
It was predicted that in 2018 green cryptocurrencies would start to have their day, based on environmental data built into blockchain. Energy companies were in some cases carrying out pilots for peer-to-peer energy transactions and platforms, used for trading. However, some benefits of energy savings have been found to lead to a so-called “rebound effect” where the benefits gleaned are offset by the fact that environmentally unfriendly behaviour occurs with the savings made that would otherwise be spent on energy. To counteract this, green cryptocurrencies could be built in such a way that would ensure that the benefits gained could only be offset against payments for green products and services.
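The counter-measure described above, letting savings be spent only on green products and services, amounts to a spend-restriction rule. A minimal sketch in Python; the merchant names and the `GreenCredit` class are hypothetical, invented for illustration:

```python
# Hypothetical whitelist of approved green merchants.
GREEN_MERCHANTS = {"solar-coop", "bike-share", "retrofit-co"}

class GreenCredit:
    """A credit balance that can only be spent with approved green merchants,
    countering the 'rebound effect' described above."""

    def __init__(self, balance: float) -> None:
        self.balance = balance

    def spend(self, merchant: str, amount: float) -> bool:
        """Return True and deduct the amount only for a valid green purchase."""
        if merchant not in GREEN_MERCHANTS or amount > self.balance:
            return False
        self.balance -= amount
        return True
```

A real green cryptocurrency would enforce this rule in the ledger itself rather than in client code, but the effect is the same: savings cannot leak back into environmentally unfriendly spending.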
While these innovations are welcome and likely to deliver real environmental benefits, this does not change the fact that cryptocurrencies and mining, as the system is structured at the moment, are damaging to the environment. It is not clear to what extent green cryptocurrencies would themselves consume electricity that offsets the benefits gained from them. Overall, further innovation is required to ensure that mining technology improves and reduces the gigantic amounts of electricity these processes consume.
In What Ways Could Blockchain Support the Environment?
Blockchain is more than cryptocurrencies, though. It is also a way to develop software that is safer and, in theory, distributed, as we have seen in other sections of this guide. How, then, could blockchain tech support the environment? FutureThinkers have compiled the following practical examples that give readers a better picture of ways this technology could be used:
- Blockchain can be used to track environmental compliance and the impact of treaties, decreasing fraud and manipulation.
- Donations to charities can be tracked to ensure that they are being distributed efficiently and as planned.
- Products can be tracked from origin to end consumer. This can help reduce carbon footprints, increase ethical accountability, and reduce unsustainable practices.
- Schemes such as recycling can be incentivised by offering token rewards to participants.
- Peer-to-peer localised energy distribution is possible, rather than the current system of a centralised hub.
- Blockchain can also be used to track the carbon footprint of products, which can then determine the amount of carbon tax to be charged.
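Several of the items above (treaty compliance, donation tracking, supply-chain provenance, carbon accounting) rely on the same property: a tamper-evident ledger. A minimal hash-chained ledger sketch in Python, illustrating the principle rather than any production blockchain; the field names are invented for the example:

```python
import hashlib
import json

def record_entry(chain: list, data: dict) -> None:
    """Append a tamper-evident entry: each entry hashes its predecessor."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    chain.append({
        "data": data,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })

def verify_chain(chain: list) -> bool:
    """Recompute every hash; any edited entry breaks the chain."""
    prev_hash = "0" * 64
    for entry in chain:
        payload = json.dumps({"data": entry["data"], "prev": prev_hash},
                             sort_keys=True)
        if entry["prev"] != prev_hash:
            return False
        if entry["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True
```

Because each entry commits to the hash of the one before it, retroactively changing a recorded carbon figure or donation amount invalidates every later entry, which is what makes the fraud-reduction claims above plausible.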
Across this comprehensive SWOT analysis of blockchain, we have described the strengths, weaknesses, opportunities, and threats of the technology. One thing is certain: blockchain tech is here to stay. We live in a world that is increasingly digitized and interconnected, and with the rise of the Internet of Things, blockchain tech, with its focus on trust and accountability and its distributed character, might be the perfect technology to structure the global digital networks of the future.
One cannot forget, though, that there are also various problems and weaknesses that need to be tackled. Nor can one simply take it at face value that this technology is better and fairer just because it is disruptive.
Solutions will certainly appear along the way, as more people experiment with putting its theoretical concepts into practice. Our hope is that blockchain will have positive implications for the implementation of a fairer and more sustainable circular economy, one that better tackles environmental issues, inequality, and other problems of our current broken system. For that to happen, all of us must bring a high dose of attention to detail and critical reasoning to our examination of this technology. Only by actively speaking out when it strays from its initial promises, and acting quickly to correct and prevent problems, can we build a system, perhaps built with blockchain, that provides us with a more connected and beautiful world.
Read more here: www.intelligenthq.com/feed/
By Ryan Whitwam
Google announced the nebulous Android Things platform several years ago to power various smart devices as part of the so-called Internet of Things (IoT). Now, Google says it’s rethinking the scale of Android Things.
The post Google Decides Android Things Will Only Be for Smart Speakers and Displays appeared first on ExtremeTech.
Read more here: www.extremetech.com/feed
Last year, at re:Invent, Amazon AWS launched Outposts and finally validated the concept of hybrid cloud. Not that it was really necessary, but still…
At the same time, what was once defined as a cloud-first strategy (with the idea of starting every new initiative in the cloud, often with a single service provider) is today evolving into a multi-cloud strategy. This new strategy covers a broad spectrum of possibilities, ranging from deployments on public clouds to on-premises infrastructures.
Purchasing everything from a single service provider is very easy and solves numerous issues but, in the end, this means accepting a lock-in that doesn’t pay off in the long run. Last month I was speaking with the IT director of a large manufacturing company in Italy who described how over the last few years his company had enthusiastically embraced one of the major cloud providers for almost every critical company project. He reported that the strategy had resulted in an IT budget out of control, even when taking into account new initiatives like IoT projects. The company’s main goal for 2019 is to find a way to regain control by repatriating some applications, building a multi-cloud strategy, and avoiding past mistakes like going “all in” on a single provider.
There Is Multi-Cloud and Multi-Cloud
My recommendation to them was not to merely select a different provider for every project but to work on a solution that would abstract applications and services from the infrastructure. This means you can buy a service from a provider, but you can also decide to go for raw compute power and storage and build your own service instead. This service will be optimized for your needs and will be easy to replicate and migrate on different clouds.
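The abstraction described above can be sketched as a small interface that callers program against, so the backing provider can be swapped or migrated without touching application code. This is an illustrative sketch; `InMemoryStore` is a hypothetical stand-in for a real cloud backend such as an S3-compatible store:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Minimal storage interface; applications depend on this,
    not on any one provider's SDK."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend for the sketch; a real deployment would wrap
    an S3-compatible store, a filesystem, or another provider API."""

    def __init__(self) -> None:
        self._objects = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]

def migrate(src: ObjectStore, dst: ObjectStore, keys: list) -> None:
    """Copy objects between backends through the shared interface,
    which is exactly what a cloud-to-cloud repatriation needs."""
    for key in keys:
        dst.put(key, src.get(key))
```

The design choice here is the point of the article: because `migrate` only sees the interface, moving from one cloud to another (or back on-premises) is a data-copy problem, not a rewrite.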
Let’s take an example. You can have access to a NoSQL database from your provider of choice, or you can decide to build your own NoSQL DB service starting from products available in the market. The former is easier to manage, whereas the latter is more flexible and less expensive. Containers and Kubernetes can make it easier to deploy, manage, and migrate from cloud to cloud.
Kubernetes is now available from all major providers in various forms. The core is the same, and it is pretty easy to migrate from one platform to another. And once you are working with containers, you’ll find plenty of ready-made images, and others can be prepared for every need.
Storage, as always, is a little bit more complicated than compute. Data has gravity and, as such, is difficult to move; but there are a few tools that come in handy when you plan for multi-cloud.
Block storage is the easiest to move. It is usually smaller in size, and now there are several tools that can help protect, manage and migrate it — both at the application and infrastructure levels. There are plenty of solutions. In fact, almost every vendor now offers a virtual version of its storage appliances that run on the cloud, as well as other tools to facilitate the migration between clouds and on-premises infrastructures. Think about Pure Storage or NetApp, just to name a couple. It’s even easier at the application level. Going back to the NoSQL mentioned earlier, solutions like Rubrik DatosIO or Imanis Data can help with migrations and data management.
File and object stores are significantly bigger and, if you do not plan in advance, things could get a bit complicated (but still feasible). Start by working with standard protocols and APIs. Those who choose the S3 API for object storage needs will find it very easy to select a compatible storage system both in the cloud and for on-premises infrastructures. At the same time, many interesting products now allow you to access and move data transparently across several repositories (the list is getting longer by the day but, just to give you an idea, take a look at HammerSpace, Scality Zenko, RedHat Noobaa, and SwiftStack 1Space). I recently wrote a report for GigaOm about this topic and you can find more here.
The same goes for other solutions. Why would you stay with a single cloud storage backend when you can have multiple ones, get the best out of them, maintain control over data and manage it on a single overlaying platform that hides complexity and optimizes data placement through policies? Take a look at what Cohesity is doing to get an idea of what I’m saying here.
The Human Factor of Multi-Cloud
Regaining control of your infrastructure is good from the budget perspective and for the freedom of choice it provides in the long term. On the other hand, working more on the infrastructure side of things requires an investment in people and their skills. I’d put this as an advantage, but not everybody thinks this way.
In my personal opinion, it is highly likely that a more skilled team will make better choices, react more quickly, and build optimized infrastructures that positively impact the competitiveness of the entire business. On the other hand, if the organization is too small, it is hard to find the right balance.
Closing the Circle
Amazon AWS, Microsoft Azure and Google Cloud are building formidable ecosystems and you can decide that it is ok for you to stick with only one of them. Perhaps your cloud bill is not that high and you can afford it anyway.
You can also decide that multi-cloud means multiple cloud silos, but that is a very bad strategy.
Alternatively, there are several options out there to build your Cloud 2.0 infrastructure and maintain control over the entire stack and data. True, it’s not the easiest path, nor the least expensive at the beginning, but it is the one that will probably pay off the most in the long term and will increase the agility and competitiveness of your infrastructure. This March, on the 26th, I will be co-hosting a GigaOm webinar sponsored by Wasabi on this topic, and there is an interview I recorded not too long ago with Zachary Smith (CEO of Packet) about new ways to think about cloud infrastructures. It is worth a listen if you are interested in knowing more about a different approach to cloud and multi-cloud.
Originally posted on Juku.it
Read more here: gigaom.com/feed/
By Ron Amadeo
Android Things, Google’s stripped-down version of Android named for its focus on the “Internet of Things” (IoT), is now no longer focused on IoT. A post on the Android Developers Blog announced the pivot, saying, “Given the successes we have seen with our partners in smart speakers and smart displays, we are refocusing Android Things as a platform for OEM partners to build devices in those categories moving forward.”
Originally, Android Things was Google’s stripped-down version of Android for everything smaller than a smartphone or smartwatch. The goal was to have the OS be the IoT version of Android, but rather than the skinnable, open source version of Android that exists on phones, Android Things is a “managed platform”—a hands-off OS with a centralized, Google-managed update system. Just like Windows, manufacturers would load an untouched version of the OS and be restricted to the app layer of the software package. Today, legions of IoT devices are out there running random operating systems with basically no plan to keep up with security vulnerabilities, and the result is a security nightmare. The wider Android ecosystem doesn’t have a great reputation when it comes to security, but Android Things updates are completely managed by Google via a centralized update system, and just like a Pixel phone, devices running Things would have been some of the most up-to-date and secure devices available.
Seeing Android Things undergo a major pivot now is pretty strange. The OS has just survived a lengthy initial development cycle (originally, Android Things started out as a rebrand of “Project Brillo“), and it only hit version 1.0 nine months ago. The first consumer products with Android Things, third-party smart displays like the Lenovo Smart Display, only launched in July.
Read more here: feeds.arstechnica.com/arstechnica/index?format=xml