News and Updates

Lawsuit filed to recover EQN.com domain name

Domain Name Wire - Tue, 2018-10-30 19:16

Domain was stolen in 2016, according to lawsuit.

A California company has filed a lawsuit (pdf) to recover the domain name EQN.com, which it says was stolen from its Enom account.

Blackshore Properties, Inc. of San Marino, California acquired EQN.com in a NameJet auction in 2011 for $4,950.

According to DomainTools historical Whois records, the domain name was transferred to a Chinese registrant at domain registrar 22.cn between May 5, 2016 and September 23, 2016. The lawsuit alleges that this transfer was theft.

The suit was filed in U.S. District Court in Virginia, where .com registry Verisign is located. Assuming no one shows up to defend the in rem lawsuit, the judge will likely enter a default judgment and order Verisign to transfer the domain name.


Related posts:
  1. Lawsuit filed to recover stolen OnlineMBA.com domain name
  2. Lawsuit filed to recover 6213.com, an allegedly stolen domain name
  3. Former SNN.com owner goes to court to recover domain name
Categories: News and Updates

The economics of “forever” domain registrations

Domain Name Wire - Tue, 2018-10-30 18:02

Should the domain name industry introduce perpetual domain registrations?

Consumers like predictability. They dislike price increases on their cable bill, healthcare premiums and mortgage.

Companies like cash in the bank. They can do a lot with it: invest, buy back shares, etc.

So the idea of a “forever” domain registration in which one payment is made and you own the domain for life could appeal to an end user. It also appeals to the domain industry because it can use the cash and not have to worry about renewals.

Last week at the ICANN meeting in Barcelona, Epik founder Rob Monster explained to me what he’s trying to do with the concept of perpetual domain registrations. He’s offering these registrations on Epik for $420. Think of it as a proof of concept to show there’s demand for such a product.

What really needs to happen for domain registrars and registries to start offering a product like this is for the entire supply chain to be on board. The registry must offer a one-time price and ICANN must as well. Would ICANN accept a one-time $10 payment to cover all future fees it charges for domain registrations? Would Verisign accept a one-time $200 payment to renew a .com domain in perpetuity?

Epik’s Forever registrations

As it stands right now, Epik’s forever registration starts with a 10-year registration, the maximum allowed by the registries. After the registry fee and ICANN fees are accounted for, Epik is left with $326 in working capital. It then renews the domain for another year each year.

Depending on the rate you assign to your cost of capital, this gives Epik a lot of room. Monster figures that as long as the renewal price of a .com is $19.50 or less per year, the .com is self-financing into perpetuity without even tapping into the $326 “deposit”.
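As a quick sanity check on that claim, here is the break-even arithmetic in Python. The assumed mechanics (the $326 invested at some annual return, with the yield covering renewals) are my inference from the numbers above, not something Epik has stated.

```python
# Back-of-the-envelope check of Monster's self-financing claim.
# Assumption (mine, not Epik's): the $326 deposit earns an annual
# return, and renewals are self-financing if the yield covers the
# renewal fee without touching the principal.
deposit = 326.00   # working capital left after the 10-year registration
renewal = 19.50    # Monster's break-even renewal price per year

implied_return = renewal / deposit
print(f"implied annual return: {implied_return:.1%}")  # ~6.0%
```

In other words, Monster's $19.50 figure implies roughly a 6% annual return on the deposit.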

But what will the market be like in 10, 20 or 30 years? There’s a lot of uncertainty. Domains as we know them have only been around for about 30 years.

Let’s say the NTIA agrees to let Verisign increase prices 5% a year starting next year, and assume ICANN does the same with its fees. It will still take nearly 20 years for the wholesale price of .com to top $20. A hundred years from now, the wholesale price would be around $950 per year.
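For what it’s worth, here is that compounding worked through in Python. The $7.85 base is the approximate .com wholesale price in 2018 (my assumption); the exact outputs depend on the base price and start year, so treat this as a sketch of the math rather than a reproduction of the article’s figures.

```python
# Compound a 5% annual increase on the .com wholesale price.
base = 7.85   # approximate 2018 .com wholesale price in USD (my assumption)
rate = 1.05   # 5% increase per year

# Years until the wholesale price tops $20.
years_to_20 = next(n for n in range(1, 200) if base * rate ** n > 20)
print(years_to_20)                   # 20, i.e. "nearly 20 years"

# Price after 100 years of 5% annual increases.
print(round(base * rate ** 100, 2))  # ~1032, the same ballpark as ~$950
```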

Epik’s terms of service give the company an out, albeit one that is friendly to registrants. Epik can cancel the registration at any time as long as it refunds 100% of the purchase price. It’s smart to have this exit clause given how much the market could change over time.

A panacea for domain companies and registrants?

I like the concept of perpetual registrations, but I believe the entire industry needs to get on board. This is Monster’s goal: prove demand and get people talking.

Public domain companies recognize revenue over time. So selling a perpetual registration for $420 today doesn’t let them recognize $420 in revenue. They’d have to recognize it over time…a long time.

But they can still invest the upfront cash and not have to worry about domain renewals. So it would be nice.

Will many consumers bite, though?

After all, people can already register domains for a decade at a time but few people do that. Consider the most recent Verisign registration report from June.

In June, there were 2,828,662 new .com registrations. 89% were for only one year. Fewer than 10,000 were for eight years or more.

6,525,072 .coms were renewed that month. Again, fewer than 10,000 were renewed for eight or more years.
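To put those counts in perspective, here is the share they represent, worked out in a quick script (the percentages are my calculation from the report figures above):

```python
# Eight-plus-year registrations as a share of June's .com activity,
# using the Verisign report figures quoted above.
new_regs = 2_828_662
renewals = 6_525_072
long_term = 10_000   # "fewer than 10,000" in each case

print(f"new:     {long_term / new_regs:.2%}")   # under 0.36% of new registrations
print(f"renewed: {long_term / renewals:.2%}")   # under 0.16% of renewals
```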

To be fair, consumers might not see the advantage of paying in advance since .com prices have been steady for many years now. Also, it’s a 10-year registration, not lifetime. But there’s still a benefit to long registrations; you’re less likely to let a domain lapse.

Despite this, consider that many Fortune 500 companies only renew their primary domain one year at a time. Fedex.com expires next month!

It’s worth a conversation

I think it’s worth having a conversation about perpetual domain registrations. In order for it to go mainstream, all players in the supply chain must be contractually involved.

At a minimum, it might be time to talk about extending the 10-year maximum for registering and renewing domains.



Categories: News and Updates

Ossification and Fragmentation: The Once and Future 'net

Domain industry news - Tue, 2018-10-30 17:20

Mostafa Ammar, out of Georgia Tech (not my alma mater, but many of my engineering family are alumni there), recently posted an interesting paper titled The Service-Infrastructure Cycle, Ossification, and the Fragmentation of the Internet. I have argued elsewhere that we are seeing the fragmentation of the global Internet into multiple smaller pieces, primarily based on the centralization of content hosting combined with the rational economic decisions of the large-scale hosting services. The paper in hand takes a slightly different path to reach the same conclusion.

The author begins by noting networks are designed to provide a set of services. Each design paradigm not only supports the services it was designed for but also allows for some headroom, which allows users to deploy new, unanticipated services. Over time, as newer services are deployed, the requirements on the network change enough that the network must be redesigned.

This cycle, the service-infrastructure cycle, relies on a well-known process of deploying something that is "good enough," which allows early feedback on what does and does not work, followed by quick refinement until the protocols and general design can support the services placed on the network. As an example, the author cites the deployment of unicast routing protocols. He marks the beginning of this process as 1962, when Prosser's routing procedures were first published, and its end as 1995, when BGPv4 was deployed. Across this time, routing protocols were invented, deployed, and revised rapidly. Since around 1995, however — a period of over 20 years at this point — routing has not changed all that much. So there were around 35 years of rapid development, followed by what is now over 20 years of stability in the routing realm.

Ossification, for those not familiar with the term, is a form of hardening. Petrified wood is an ossified form of wood. An interesting property of petrified wood is that it is fragile; if you pound a piece of "natural" wood with a hammer, it dents but does not shatter. Petrified, or ossified, wood shatters like glass.

Multicast routing is held up as an opposite example. Based on experience with unicast routing, the designers of multicast attempted to "anticipate" the use cases, such that early iterations were clumsy, and failed to attain the kinds of deployment required to get the cycle of infrastructure and services started. Hence multicast routing has largely failed. In other words, multicast ossified too soon; the cycle of experience and experiment was cut short by the designers trying to anticipate use cases, rather than allowing them to grow over time.

Some further examples might be:

  • IETF drafts and RFCs were once short, and used few technical terms, in the sense of a term defined explicitly within the context of the RFC or system. Today RFCs are veritable books, and require a small dictionary to read.
  • BGP security, which is mentioned by the author as a victim of ossification, is actually another example of early ossification destroying the experiment/enhancement cycle. Early on, a group of researchers devised the "perfect" BGP security system (which is actually by no means perfect — it causes as many security problems as it resolves), and refused to budge once "perfection" had been reached. For the last twenty years, BGP security has not notably improved; the cycle of trying and changing things has been stopped this entire time.

There are weaknesses in this argument as well. It can be argued that widespread multicast failed because the content just wasn't there when multicast was first considered — in fact, multicast content still is not what people really want. The first "killer app" for multicast was replacing broadcast television over the Internet. What has developed instead is video on demand; multicast is just not compelling when everyone is watching something different whenever they want to.

The solution to this problem is novel: break the Internet up. Or rather, allow it to break up. The creation of a single network from many networks was a major milestone in the world of networking, allowing the open creation of new applications. If the Internet were not ossified through business relationships and the impossibility of making major changes in the protocols and infrastructure, it would be possible to undertake radical changes to support new challenges.

The new challenges offered include IoT, the need for content providers to have greater control over the quality of data transmission, and the unique service demands of new applications, particularly gaming. The result has been the flattening of the Internet, followed by the emergence of bypass networks — ultimately leading to the fragmentation of the Internet into many different networks.

Is the author correct? It seems the Internet is, in fact, becoming a group of networks loosely connected through IXPs and some transit providers. What will the impact be on network engineers? One likely result is deeper specialization in sets of technologies — the "enterprise/provider" divide that had almost disappeared in the last ten years may well show up as a divide between different kinds of providers. For operators who run a network that indirectly supports some other business goal (what we might call "enterprise"), the result will be a wide array of different ways of thinking about networks, and an expansion of technologies.

But one lesson engineers can certainly take away is this: the concept of agile must reach beyond the coding realm and into the networking realm. There must be room "built-in" to experiment, deploy, and enhance technologies over time. This means accepting and managing risk rather than avoiding it, and developing a deeper understanding of how networks work and why they work that way, rather than the blind focus on configuration and deployment we currently teach.

Written by Russ White, Network Architect at LinkedIn


More under: Networks

Categories: News and Updates

Group discusses Dot-Brand domains in Barcelona

Domain Name Wire - Tue, 2018-10-30 15:00

Angie Graves reports on the Brand Registry Group meeting in Barcelona last week.

Cole Quinn (Microsoft) comments during the Brand Registry Group meeting in Barcelona October 25. Photo courtesy Martin Sutton/Brand Registry Group.

What better way to break up the monotony of dry ICANN policy sessions than a gathering with speakers addressing real-world practices and marketplace challenges?

On October 25, the Brand Registry Group (BRG) hosted a seven-hour program for BRG members, brand registrants and consultants, presenting case studies, histories, timelines, and opportunities. Alongside the BRG, and with cooperation from the Domain Name Association, the event’s sponsors were Neustar, Authentic Web, Lemarit, SafeBrands, Valideus, and Afilias.

The goal was to educate attendees and raise awareness of .brands, and of the value of the BRG, which was formed in 2013 by a number of companies that saw benefit in collaborating in the dot-brand TLD space.

Judging from the questions posed during the session, people are interested in understanding best practices for use of existing new gTLDs, potential timelines for the next round of new gTLD registrations, and challenges in managing corporate gTLD and domain name portfolios.

The program covered territory useful to registrants of unused brand TLDs, as well as to companies and their advisors considering applying in the next round. Those who made bold moves explained lessons learned, as well as potential landmines and ways to avoid them. The format was a mix of presentations, panels, discussions, and question-and-answer periods.

BRG President Cole Quinn from Microsoft kicked off the event by sharing his experience as the manager of Microsoft’s domain name portfolio, and how the BRG inspired him in his role, provided access to expertise for Microsoft TLDs, and gave him the language needed to effectively communicate the value proposition of gTLDs to division leadership teams within Microsoft. Quinn explained how Microsoft runs its corporate domain portfolio and gave a sneak peek into its tool showing workflows and automation.

Participants asked what defines “success”, as well as about the challenges in communicating internal successes, especially for the benefit of the ICANN community and the brand community, both of which are making decisions about the future of the new gTLD program. They acknowledged that community perception is mixed on the question of the success of the first round of new gTLD registrations.

A look back at the first round of brand gTLD registrations led to a discussion of lessons learned, an analysis of the ways consultants and applicants approached the program, hurdles that were overcome, and challenges that became apparent during the process.

A snapshot of the current state of the .brand TLD space was presented, along with details of hurdles that are yet to be overcome, and examples of helpful sites dedicated to tracking progress in the space.

Other presenters shared details of their experiences as managers of their own corporate registries, including Cecilia Smith of 21st Century Fox, Kevin Audritt from HSBC, Google’s Ben McIlwain, and Kate Kahle from CERN. As they shared not only their experiences so far but also their ideas and plans for development going forward, interest from the attendees was apparent, with each presentation generating additional questions about details.

The session concluded with an in-depth look at the needs identified for success in the next round, and Jeff Neuman presented an estimated timeline and the work that needs to be done within the ICANN community before the next round can be launched. Best-case scenario for the next round of applications? Late 2021.


Related posts:
  1. New TLDs this week: .Fashion, .Garden and .FRL
  2. How much it costs to run a domain name registry
  3. Local Popularity & Global Influence: Nations and nTLDs
Categories: News and Updates

Valuable three-letter .Com domains stolen

Domain Name Wire - Tue, 2018-10-30 12:29

NNN.com, Wok.com among domains man claims were stolen.

A Japanese man alleges that valuable domain names were stolen from his registrar account and transferred to another registrar. These aren’t run-of-the-mill three-letter domains, either. The domains he says were stolen include nnn.com and wok.com.

Yoshiki Okada filed (pdf) the in rem action in U.S. Federal District Court in Virginia where the .com registry Verisign is located. Okada claims that eol.com, fde.com, jol.com, nnn.com, olp.com, tang.com, wok.com, wtv.com, and zhang.com were stolen from him.

It’s unlikely that the alleged thief will show up to represent himself in Virginia. In most cases, the judge in this type of lawsuit will accept a default judgment and order Verisign to transfer the domains. The exception is when a stolen domain has been re-sold, in which case the buyer might try to defend the domain name.



Related posts:
  1. 8 Clues a Domain Name is Stolen
  2. Company sues over stolen domains after losing UDRP
  3. Lawsuit filed to recover stolen domains: 5678.com, 26266.com and Manhua.com
Categories: News and Updates

Over the Top Services at the ITU PP-2018: Considering the Pittsburgh Massacre

Domain industry news - Mon, 2018-10-29 20:36

This past Saturday, a self-professed neo-Nazi massacred eleven worshipers at synagogue services in Pittsburgh. The killer was reported to have lived on, and been incited by, an "Over the Top (OTT)" service known as Gab that was purposely established to facilitate extremist activities. Within hours, the cloud service providers hosting its services announced they would no longer do so. Presumably, the threat of potential civil litigation liability and other penalties, as well as adverse publicity, provided the motivation.

Meanwhile, a third of the world away in Dubai, representatives of the world's nations began assembling for the ITU 2018 Plenipotentiary Conference and considering a proposed treaty instrument resolution on OTT services from four country blocs (Africa, Arab, Europe, and Russian) plus Brazil and the U.S. The proposals contain a spectrum of views on OTT services — ranging from unbounded exuberance to cautious concern about economic effects. None raised any explicit treatment of the potential adverse societal consequences such as those that manifested themselves in Pittsburgh, in addition to the patent adverse national security impacts. None gave any recognition to long-established international treaty instruments proscribing the promotion of religious hatred, racism and xenophobia and the incitement of related violence over internets.

Somehow over the past two decades, nations, transnational service providers, and international institutions pursuing new network and OTT virtualizations as part of political-economic agendas have lost sight of the fact that these are not necessarily wonderful "transformative opportunities" for everyone. Indeed, the collateral harms can be grievous, and the benefits not even "good" or sustainable in any meaningful societal or economic sense. OTTs are inherently subject to profound misuse and exploitation — amplified by transnational use. Indeed, they were used by foreign intelligence services to attack the U.S. electoral process.

None of this is new. More than twenty years ago, the UNHCR held a seminal workshop in Geneva to which the ITU and country missions were invited, and where concern was voiced over the growing use of the DARPA internet platforms by neo-Nazis and other extremists.

What an OTT treaty instrument resolution should contain

PP-2018 participants should minimally consider including two significant provisions in their OTT resolution. One is a specific recognition of the multiple human rights treaty provisions that proscribe the promotion of religious hatred, racism and xenophobia and the incitement of related violence via internets, with a request that Member States take appropriate action. A second is to better balance the benefits of OTTs against the potential detriments and harmful exploitations of OTT use, and to take explicit steps to mitigate them, such as transnational notice-and-takedown capabilities and controls on the use of ephemeral encryption platforms. In general, the concept of "Red Teaming" the introduction of new technologies and services is long overdue at the ITU.

Written by Anthony Rutkowski, Principal, Netmagic Associates LLC


More under: Internet Governance, Policy & Regulation

Categories: News and Updates

U.S. Government follows up on “cooling off period” idea for ICANN employees

Domain Name Wire - Mon, 2018-10-29 20:28

NTIA chief wants ICANN to restrict post-ICANN employment.

Following up on remarks made during ICANN 63 in Barcelona last week, National Telecommunications and Information Administration (NTIA) head David Redl sent a letter to ICANN on Friday criticizing the practice of people leaving the regulator for industry companies.

Redl first mentioned the concern publicly last week in comments during the meeting. His interest in the subject presumably stems from Donuts hiring Akram Atallah as its new CEO. Atallah led the Global Domains Division at ICANN.

Here’s the text of Redl’s letter to ICANN:

NTIA continues to be a strong supporter of the multistakeholder approach to Internet Governance, including through our participation as the U.S. representative to the Governmental Advisory Committee (GAC) at the Internet Corporation for Assigned Names and Numbers (ICANN). Paramount to the success of the multistakeholder approach is trust in the institutions that make decisions about the Internet’s future. While the community has greatly improved ICANN’s accountability through the IANA stewardship transition process, I am writing to raise a concern about an accountability deficit at ICANN.

Recent ICANN senior staff departures have highlighted that ICANN lacks postemployment restrictions. Given that ICANN, through the enforcement of its contracts with domain name registries and registrars, performs an industry self-regulatory function, it is necessary that conflicts of interest or appearance of unethical behavior be minimized. While the United States will recommend this issue be addressed in the third iteration of ICANN’s Accountability and Transparency Review Team (ATRT3), which we expect to have had its initial meeting no later than June 2019, I encourage you to look into this now. One potential fix could be “cooling off periods” for ICANN employees that accept employment with companies involved in ICANN activities and programs. This is an ethical way to ensure that conflicts of interest or appearances of unethical behavior are minimized.

One challenge I foresee is that ICANN sometimes hires people from the industry because of their specialized knowledge. It might be difficult to attract these people if they can’t return to industry companies. ICANN would need to compensate them for the restriction. I also wonder if employment laws in various states and countries would limit the ability to enforce cooling off provisions.


Related posts:
  1. ICANN board resuscitates .Hospital top level domain name
  2. Court denies Donuts’ request to delay .web auction
  3. Donuts demands $22.5 million for .Web domain name
Categories: News and Updates

IBM to Acquire Red Hat, Plan to Provide Open Approach to Cloud

Domain industry news - Mon, 2018-10-29 19:39

Ginni Rometty, Chairman, President, and CEO of IBM, at right, and James M. Whitehurst, CEO of Red Hat, left, announced, Sunday, October 28, 2018, Armonk NY, that the companies have reached a definitive agreement under which IBM will acquire all of the issued and outstanding common shares of Red Hat.

IBM and Red Hat, a leading provider of open source cloud software, have announced that IBM will acquire all of the issued and outstanding common shares of Red Hat, representing a total enterprise value of approximately $34 billion. This changes everything about the cloud market, says Ginni Rometty, IBM President and CEO. "IBM will become the world's #1 hybrid cloud provider, offering companies the only open cloud solution that will unlock the full value of the cloud for their businesses."

Red Hat will continue to run as a standalone business unit within IBM, the company told reporters, adding that it "will remain committed to Red Hat's open source ethos, its developer community and its open source community relationships."

Rometty on the next chapter of the cloud: "Most companies today are only 20 percent along their cloud journey, renting compute power to cut costs. The next 80 percent is about unlocking real business value and driving growth. This is the next chapter of the cloud. It requires shifting business applications to hybrid cloud, extracting more data and optimizing every part of the business, from supply chains to sales."


More under: Cloud Computing

Categories: News and Updates

Internet Society Seeks Nominations for 2019 Board of Trustees

Domain industry news - Mon, 2018-10-29 17:48

Are you passionate about ensuring the Internet remains open, globally-connected, secure and trustworthy - for everyone? Do you have experience in Internet standards, development or public policy? If so, please consider applying for one of the open seats on the Internet Society Board of Trustees.

Founded by Internet pioneers, the Internet Society (ISOC) is a non-profit organization dedicated to ensuring the open development, evolution and use of the Internet. Working through a global community of chapters and members, the Internet Society collaborates with a broad range of groups to promote the technologies that keep the Internet safe and secure, and advocates for policies that enable universal access. The Internet Society is also the organizational home of the Internet Engineering Task Force (IETF).

The Board of Trustees provides strategic direction, inspiration, and oversight to advance the Society's mission.

In 2019:

  • the Internet Society's chapters will select one Trustee;
  • its Organization Members will select two Trustees; and
  • the IETF will select one Trustee.

Membership in the Internet Society is not required to nominate someone (including yourself), to stand for election, or to serve on the Board. Following an orientation program, all new Trustees will begin 3-year terms commencing with the Society's annual general meeting in July 2019.

Nominations close at 15:00 UTC on Friday, December 14, 2018.

Find out more by reading the Call for Nominations and other information available at: https://www.internetsociety.org/trustees

Written by Dan York, Author and Speaker on Internet technologies - and on staff of Internet Society


More under: Internet Governance

Categories: News and Updates

Amid Shutdown, Gab.com Claims Free Speech Infringement While Many Others View Them as Hate Site

Domain industry news - Mon, 2018-10-29 17:42

The controversial site gab.com has been shut down by GoDaddy and given 2 days to move the domain elsewhere. The deadline expires at midnight tonight Irish time.

In recent days the site has seen itself become increasingly disconnected as various service providers and online platforms including PayPal have shut the door to them. At present the site is displaying this notice:

The text reads:

Gab.com is under attack. We have been systematically no-platformed by App Stores, multiple hosting providers, and several payment processors. We have been smeared by the mainstream media for defending free expression and individual liberty for all people and for working with law enforcement to ensure that justice is served for the horrible atrocity committed in Pittsburgh. Gab will continue to fight for the fundamental human right to speak freely.

As we transition to a new hosting provider Gab will be inaccessible for a period of time. We are working around the clock to get Gab.com back online. Thank you and remember to speak freely.

GoDaddy wasn't the hosting provider for the site, and it currently uses Cloudflare's DNS servers, so it's not clear who the host is, as the previous provider, Joyent, pulled the plug. However, as GoDaddy will be pulling the registration later today, the site will need to find a new registrar first.

GoDaddy's notice to GAB is pretty clear (screenshot from their Twitter account which might get suspended):

Registrars like GoDaddy generally avoid getting involved in issues related to "content" on websites, but will when they feel that they are justified in doing so under their terms of service.

Where will Gab.com end up?

It's not clear at this juncture. As with the case of DailyStormer, there are very few companies that will want to attract a domain with so much negative attention to their platforms. The domain itself is a three-letter .com, so many in the domain investment space will be watching it like a hawk; as Andrew notes, the domain was bought for $200k.

Written by Michele Neylon, MD of Blacknight Solutions


More under: Censorship, Domain Names, Internet Governance

Categories: News and Updates

Alphabet registers more Waymo domains as it makes progress on self-driving taxis

Domain Name Wire - Mon, 2018-10-29 16:58

Alphabet subsidiary Waymo registers domains in preparation for expanded service.

That’s me next to a driverless car from Google (Waymo). The Waymo taxis on the road look different.

Self-driving taxis are getting closer to reality.

Alphabet (NASDAQ:GOOGL) has begun testing fares for beta users of its Waymo taxi service. In conjunction with the news about its progress, Waymo registered a handful of domain names:

waymo1.com
waymobayarea.com
waymodaily.com
waymoearlyrider.com
waymoearlyriders.com
waymofleetsupport.com
waymolidar.com
waymolidars.com
waymonext.com
waymoone.com
waymophoenix.com
waymoriders.com
waymoridersupport.com
waymosensor.com
waymosensors.com
waymosupport.com
waymotoday.com

Waymo is primarily testing its service in Phoenix. Early Rider is the name of its beta program that has hundreds of users.

As the company expands its self-driving taxi service to more cities and more riders, domains like WaymoSupport.com are smart defensive registrations.


Related posts:
  1. Google’s Waymo domain strategy and what it says about domain choices
  2. Google, ABC.xyz, and the growing awareness of new domain names
  3. 2015’s Top 5: Google now part of Alphabet, uses abc.xyz
Categories: News and Updates

Country Code Domains – DNW Podcast #208

Domain Name Wire - Mon, 2018-10-29 15:30

Two guests shed light on ccTLDs.

.Com is king in much of the world, but not in Europe. In many European countries, the country code domain name (ccTLD) is used more often than .com. On today’s show, I talk to two people who live and breathe ccTLDs. First, I talk with Peter Van Roste, the General Manager of CENTR, a group that helps European ccTLD managers. We then chat with Jakob Truelsen, the CEO of DK Hostmaster, which manages Denmark’s .DK domain name. Truelsen explains how the group was able to cut down on websites using .DK domains to sell counterfeit goods without wading into the murky area of reviewing content.

Also: Verisign sells security to Neustar, Google tries something new with .New, .AI expired domains, a big domain auction, Zoom.com and much more.

Subscribe via iTunes to listen to the Domain Name Wire podcast on your iPhone or iPad, view on Google Play Music, or click play above or download to begin listening. (Listen to previous podcasts here.)


Related posts:
  1. Frank Schilling explains price hike – DNW Podcast #127
  2. How to Sell More Domains with Adam Strong – DNW Podcast #158
  3. Reviewing this year’s predictions – DNW Podcast #161
Categories: News and Updates

Google Adsense now requires site verification

Domain Name Wire - Mon, 2018-10-29 14:11

In a throwback, sites must be validated before showing Google Adsense ads.

Sites must be verified before Adsense shows ads like in the image above. Photo courtesy Google.

Google is now verifying websites before they can use Adsense to monetize content.

In a blog post today, the company said that the process will review sites for compliance with the Adsense Program policies.

There are a lot of policies for Adsense, but I wonder if this change is in part to make sure Adsense ads don’t show up on sites with racist or hateful content, or content that incites violence. This has been a hot topic lately after the radicalization of people carrying out acts such as sending pipe bombs and committing mass shootings.

The process will also verify that users own the domain or can modify its content.

Longtime Adsense users won’t find this too burdensome. In the old days, all sites had to be reviewed before they were accepted to show Adsense ads.


Related posts:
  1. Google Now Targeting Ads to Users, Not Just Content
  2. Adsense as a Platform and What it Means for Publishers
  3. Domain Parking Coming to .Tel
Categories: News and Updates

GoDaddy tells Gab.com to find a different domain registrar

Domain Name Wire - Mon, 2018-10-29 13:50

Registrar tells the controversial social network to take a hike.

A notice that Gab.com says it received from GoDaddy.

Domain name registrar GoDaddy (NYSE:GDDY) has informed Gab.com that it needs to move its domain name to a different company. This is according to a tweet from the controversial social network, which posted the email notice from GoDaddy (pictured).

Gab.com has come under fire after it surfaced that the Pittsburgh synagogue shooter was active on the site. The social network is known for letting anything be published and has become popular with racists and for hate speech.

Other companies pulling the plug on Gab.com include PayPal, Stripe and its hosting providers.

GoDaddy tries to avoid suspending service to domains due to the content on them but has made exceptions. Last year it told the operator of the white supremacist site Daily Stormer to switch registrars after the Charlottesville rally. The registrar said the site was inciting violence, which is against its terms of service.

The operators of Daily Stormer and Gab are actually fortunate that GoDaddy merely asked them to switch providers. When DailyStormer.com moved to Google, the company went a step further by suspending and locking the domain.

Gab.com was originally purchased for over $200,000 on Sedo. It appears that the social network then acquired the domain on Flippa for $220,000 last month. It had been using Gab.ai for its web address.


Related posts:
  1. GoDaddy is buying NamesCon
  2. GoDaddy (GDDY) reports earnings, domain revenue of $263.3 million
  3. GoDaddy adding more expired domain registrars, increasing European aftermarket presence
Categories: News and Updates

Kim Komando on Domain Names

Domain Name Wire - Sat, 2018-10-27 21:38

Popular tech writer/podcaster introduces her audience to domain investing.

News stories and podcasts about domain investing that are geared to a mass audience can be hit or miss.

Kim Komando, “America’s Tech Goddess”, just published a podcast that includes a segment about buying and selling domain names for profit.

She has two guests who talk about how to value domains, where to sell them, and how to make money with domains. Kim also discusses how much she paid recently to buy two domains, including PodNet.com.

It’s an interesting high-level introduction to domain investing, and I’m curious what you think about the advice the two guests share. You can listen to the podcast here.



Categories: News and Updates

Obstacles in OneWeb's Negotiations with Russia

Domain industry news - Sat, 2018-10-27 20:07

This case illustrates the fact that political, security, and financial negotiations may be as difficult as designing satellites and rockets for a would-be global Internet service provider.

OneWeb is investing billions of dollars in a constellation of low-Earth orbit (LEO) Internet-service satellites.

In 2015 they placed launch orders for 21 Russian-made Soyuz rockets.

In 2017, they formed a joint venture with Russian LEO satellite operator Gonets to develop the project in Russia. At that time, Gonets was a subsidiary of Roscosmos, the Russian State Corporation overseeing and implementing the Russian space industry. OneWeb had a 60% interest in the joint venture.

This week Reuters reported that OneWeb is relinquishing its majority stake in the venture — Gonets intends to increase its stake to 51 percent.

I wonder why.

Speaking at a conference in Moscow, Federal Security Service (FSB) official Vladimir Sadovnikov objected to the project for security reasons. He feared that "Some of Russia's regions would become totally dependent on a foreign satellite service" and added that Moscow had not received any conclusive evidence that OneWeb's satellites would not be used for intelligence gathering.

(He also revealed his ignorance by apparently not understanding the difference between Iridium and OneWeb).

I wonder if the security concern is genuine — OneWeb has decided to forgo inter-satellite links in favor of routing all traffic through a system of 40 terrestrial gateways, allowing a nation to know the path of traffic into and out of their territory. Are they concerned about the possible detection of sources of trolling and hacking?

Sadovnikov added a political dimension, saying Russia favored setting up a similar network in partnership with India, China and other countries he described as non-aggressive; China has pitched a 1,000-satellite LEO project to Russia.

An unnamed source at the FSB also mentioned politics, saying "OneWeb is an important project for Roscosmos and Russia's space industry, but national security issues come first. There are many doubts regarding that project, especially because of the sanctions against us."

Spectrum is another stumbling block. OneWeb's request for a frequency band in Russia was refused, and a source at the Ministry for Digital Development and Communications said permission would be granted after legal issues regarding the joint venture were resolved. Given Russia's reputation, one can't help wondering whether the hangup has something to do with payoffs.

Another possibility is convoluted economic infighting within Russia. Gonets' Wikipedia page says it began as a Russian Federal Space Agency program, but in 1996 it was privatized and operated by Gonets SatCom, which was controlled by ISS Reshetnev. In 2017 Roscosmos acquired 80% of Gonets from ISS Reshetnev. Wikipedia is not a definitive source and I know nothing of the history of these organizations, but this sounds like it could be the kind of oligarchy-creation manipulation that occurs when state property is privatized. (The ownership of Cuban ISP Etecsa raises similar questions).

Perhaps there were management problems. Initially, launches of production satellites were planned to begin last May; then the date slipped to the first quarter of 2018. The current schedule calls for the launch of test satellites on February 7, 2019.

Regardless of the motivation for restructuring the OneWeb/Gonets venture, there is a mismatch in the aspirations of a global ISP and nationalistic governments. This case illustrates the fact that political, security and financial negotiations may be as difficult as designing satellites and rockets for a would-be global ISP.

For background on OneWeb and other low-Earth orbit satellite Internet service projects, click here.

Gonets home page, 8/10/2018. It was removed earlier this week.
The Russian home page has also been removed. Last archived copy 4/10/18.

Written by Larry Press, Professor of Information Systems at California State University


More under: Access Providers, Broadband, Wireless

Categories: News and Updates

Has Internet Governance Become Irrelevant?

Domain industry news - Fri, 2018-10-26 18:47

A panel session has been scheduled at the forthcoming Internet Governance Forum (IGF) in Paris in November that speaks to the topic that Internet Governance is on a path to irrelevance. What's this all about?

When the Internet outgrew its academic and research roots and gained some prominence and momentum in the broader telecommunications environment, it found itself in opposition to many of the established practices of the international telecommunications arrangements, and even to the principles that lie behind these arrangements. For many years, governments were being lectured that the Internet was "special," and that applying the same mechanisms of national telecommunications and trade regulations to the Internet might not wreck the entire Internet, but would surely isolate the nation that was attempting to apply these unnatural acts.

Within this broad category was the notion that conventional means of conducting trade in services was not applicable to the Internet. While an early mantra of "The Internet must be free" quickly foundered when it encountered pragmatic realities, the next mantra of "Don't tax the Internet" gathered significant momentum. What was meant here was an admonition to governments not to attempt to unduly constrain the flow of data, as such actions would imperil the future value of the Internet.

But while the Internet might have been "special" in the 1990s, such a characterization was unsustainable in the longer term. In 2003 and 2005 the United Nations hosted the two-part World Summit on the Information Society (WSIS). It was clear by the time of the millennium that the previous regime of national telephone operators and the treaties that governed the international aspects of this global service were rapidly becoming sidelined if they had not been sidelined already. The Internet was sweeping all before it and each time it engaged with another sector it appeared to come out of the encounter as a clear victor. The Internet might still be special, but by the millennium it was recognized that it was not always special in a good way.

This WSIS summit was in the context of the emergence of the so-called information society and a recognition of a widening "digital divide" where richer nations were in an obvious position to exploit the possibilities that opened up with the combination of abundant computation and communications services and thereby amass further wealth, while poorer nations yet again found themselves on the other side of the divide. Far from being a tool to help equalize the inequities in our world by allowing all to access information, education and open global markets for their services, the Internet appeared to be yet another tool to further emphasize this divide between rich and poor.

The United States was a focal point in these discussions. At the time the Internet was still strongly associated with the United States, and the US had spent much of the previous decade both promoting its benefits and profiting from the revenues flowing into US companies. This promotion of the Internet and the free flow of information was certainly not without elements of self-interest on the part of the US, as the interests of the new corporate behemoths of the Internet and the geo-political and geo-economic aspirations of the US appeared to have much in common.

However, it's often difficult to tackle the larger picture in these large-scale international forums, so it was no surprise to see attention turn to the individual elements that were contained within this picture. One of these elements that became a topic of discussion in its own right was the status of the body that oversaw the Internet's protocol parameters, including the names and IP addresses that are used as part of the central core of the Internet. This function, the Internet Assigned Numbers Authority (IANA), was originally part of the US Defense Advanced Research Projects Agency's funded activities. After a few more changes within the US Government agency landscape, responsibility for this function was shifted to a self-funded mode operated by a private sector entity, ICANN, with some level of US engagement remaining in place. This was variously portrayed as a control or as a safeguarding measure. Irrespective of the nature of the motivation, the result was that the National Telecommunications and Information Administration, part of the US Department of Commerce, oversaw a contract between the US government and ICANN regarding the operation of the IANA function.

At times perceptions matter and the lingering perception here was that the Internet was still seen to be essentially under the control of a single sovereign state.

This unique US role was always going to be a problem for other nations. The international telephone and postal networks were governed by international treaty instruments that had been in place for more than a century. To have a single nation-state positioned at the apex of this Internet structure was, to say the least, controversial. Naturally, this was a major topic in 2003 at the first WSIS gathering. The UN Secretary General at the time, Kofi Annan, convened a Working Group on Internet Governance (WGIG), a grand title which either inflated this topic to an even greater level of prominence or appropriately stated its central importance to the entire set of concerns with the structure of the Internet at the time. Again, opinions vary here. There was no clear consensus coming out of this WGIG activity, and the 2005 WSIS gathering could not reach any form of agreement on this matter.

During the WSIS process, the US apparently refused to consider any changes to its pivotal role in the management of the Internet's protocol parameters. The WSIS summit eventually agreed on a compromise approach that deferred any determination on this matter and instead decided to convene a series of meetings on the underlying policy principles relating to Internet Governance. Hence we saw the inauguration of a series of Internet Governance Forum (IGF) meetings. These forums were intended to be non-decisional forums for all stakeholders to debate the issues. Originally intended to be convened for a period of five years, culminating in the fifth IGF meeting in Vilnius, Lithuania in 2010, it has continued with two further five year extensions of its mandate, and the thirteenth IGF will take place in Paris, from 12 to 14 November 2018.

Internet Governance Forums

Even within its limited objectives, the IGF would find it challenging to claim universal success in achieving its mission.

The IGF did not manage to address the underlying tensions relating to the pivotal position of the US on the Internet. In 2011 we saw the IBSA proposal (called IBSA because of a summit convened by India, Brazil and South Africa) for a UN committee on Internet-related policy. In 2013, as a reaction to the US surveillance stories being publicly aired by Wikileaks, a number of internet organizations, including ICANN, the RIRs, and the IETF, released the "Montevideo Statement" calling on the US to step back from its central role. The US surveillance disclosures also appeared to be a major factor in Brazil's sponsorship of the 2014 netMundial initiatives, which also appeared to have the support of ICANN. Once more the call was for the cessation of US control over the Internet's protocol parameter function. At much the same time, Edward Snowden released a set of material that documented how US agencies were undertaking widespread surveillance using the Internet.

These Wikileaks and Snowden disclosures weakened the US resolve, and in October 2016 the previously unthinkable happened. The US government signed away its functional role and passed control of the protocol parameter function to an independent ICANN.

If the IGF was the forum to discuss the public policy issues related to the privileged position of the US Government with respect to the Internet, then the principal rationale for the IGF also finished in October 2016. In theory, at any rate, the US no longer claimed the ability to place its finger on the scale with respect to the carriage of these matters.

On the other hand, this is perhaps far too narrow a definition of the role and scope of the IGF. The IGF process has managed to gather a more sophisticated shared understanding of the layers within the Internet and the ways in which these various components both share common objectives and create tensions when competing to achieve similar objectives. The elements of carriage networks, consumer devices, servers, and service delivery networks, applications, and application behaviors all operate in a semi-autonomous manner. The previous model of the locus of control of an entire service environment sitting within the telephone company within each nation-state was not repeated with the Internet. The Internet has exposed each of the various component service activities as discrete activities, and instead of orchestrating these components within the framework of the procurement processes of the larger service entity, a variety of new markets have been exposed: technology standards, fixed and mobile services, computers in all forms from handsets to servers, applications, service providers and content publishers all operate semi-autonomously, and the orchestration of their actions is through markets and market interactions. The Internet is not operated by a single service delivery company, nor is it a defined destination. It is a series of intertwined markets. The implication for governance processes was profound, and the IGF has managed to both expose this change and steer a constructive path of commentary and dialogue on these changes as they have happened.

Internet Governance Today

I'd like to nominate three major themes of national and international interest in today's Internet that have some relevance to the topic of Internet governance.

The first is the set of issues that can be summarised as the "digital divide." There are still the haves and have-nots across the full spectrum of our world. The digital divide is as big as it ever was, and there is no visible movement in directions that would ameliorate the societal impacts of these changes. If anything, this divide has further broadened in scope. In absolute terms, it may be the case that more individuals have some form of internet access than was thought possible even 10 years ago. But today that's still only one half of the world's population, and the other three billion people are isolated from the mainstream. The divide also operates across other dimensions, including the cost of access, the quality and speed of access, the accessibility of information, and the extent to which goods, services and information are accessible in a local language. These all form subtle and not-so-subtle aspects of digital exclusion.

The second theme is also not a new theme, but it has dramatically increased in importance in the past two decades. Its components have various labels, including Cyber Security, Malware, Abuse, Spam and Viruses. It can be summarised in the observation that today's Internet is a toxic place that not only provides a haven for various criminal and fraudulent activities but also provides a haven for darker actions, encompassing the current set of concerns relating to terrorism and cyber-offensive tactics from state-based actors. The uncomfortable observation is that technology-based counter-measures may be failing us, and the fabric of our society seems to be very vulnerable to concerted hostile cyber-attack. We've adopted strong encryption in many parts of the environment as a means of protecting users against various forms of organized surveillance, but in so doing we've turned off the lighting that would otherwise expose various acts of malfeasance to our law enforcement bodies. We have had to make some tough decisions about balancing personal privacy and open attribution. But this lack of clear attribution and the greater ability to embed communications behind strong encryption mean that the various forms of policing this digital world have become expensive, frustrating and ultimately very selective in their application.

The third theme lies within the changes occurring within the Internet itself. In recent years we've seen the proliferation of content distribution networks that attempt to position all of the data and services that any user would request as close as possible to the user. It used to be the role of the network to bring the user to the content portal, whereas these days we are seeing content shifting itself ever closer to the user. In and of itself that's a relatively significant change to the Internet. The public carriage component of the Internet is shrinking and being replaced by private feeder networks that service these rapidly expanding Content Distribution Networks (CDNs). The bigger question concerns the residual need for global names and addresses in this CDN-centric environment. The Internet is no longer a telecommunications network that carries user traffic across a common network. Today's Internet is a content distribution network that is very similar to a television broadcast network where the transmission component is limited to the last mile access network. The essential difference here is that on the Internet each user can define their own program.

One possible response to these concerns is the perception that these situations are instances of collective failure of the Internet Governance framework. Allowing the private sector unfettered control of the public communications space has produced very mixed results. Yes, the obsessive concern with catering precisely to what users want has produced a remarkably efficient and capable supply chain that can bring the economies of massive scale to market of a single unit, and this is a modern-day marvel. But at the same time, the private sector is largely uninterested in the general health and welfare of the larger environment and the internet appears to be the victim of such collective neglect.

The public sector's forbearance with the cavalier attitude shown by various Internet players may be reaching a breaking point. The EU's General Data Protection Regulation (GDPR) is a clear signal that the honeymoon with technology is over, and various national regimes clearly want to see a more responsible and responsive attitude from these players to public concerns. Doubtless, we will continue to see fines set at levels intended to be eye-watering for even the largest of players. While this measure has the unintended side-effect of eliminating smaller players from the market and potentially stifling competition, a major public sector goal is to bring some sense of broader social responsibility back to the major players. This regulatory stance will no doubt continue in both the EU and many other regimes.

But is this increased national engagement a failure of the Internet Governance framework, or a more conventional case of public sector regulation of a market? Private corporate entities have a primary duty to their shareholders and do not necessarily have the same over-arching obligation to the public good. If self-interest and public interest coincide, then that is a wonderful coincidence of fortune, but when they differ, corporate self-interest necessarily wins. It is naive to expect that any messages of constraint and prudence directed at the private sector will be heeded unless they carry the authority of regulatory impost, with some form of punitive enforcement measure behind them.

If governments are feeling emboldened to enact regulatory measures for an industry that until now enjoyed some level of immunity from conventional social responsibilities, then how do these same governments feel about the actors that look after the elements of Internet infrastructure?

IANA and its Fellow Travellers

A highly visible part of the US position with respect to the Internet was its definition of the IANA function as an activity performed under the aegis of the US Government by a contracted agency. Three activities are loosely bound within the IANA role: the carriage of Internet addresses, domain names, and IP protocol parameters. Let's quickly look at the current position of these three activities and their relationship to the Internet Governance dialogue.

The IETF started in the late 1980s with all the youthful hubris and enthusiasm of any new entrant to the field of technology standards. Criticizing the staid incumbent standards bodies as production factories of "paperware about vapourware," the IETF paraded its difference loudly and proudly. The IETF was motivated by a determination to quickly produce specifications that allowed for interoperable implementations of useful functions. They "rejected Kings, Presidents and voting, and believed in rough consensus and running code." They were not there to create standard specifications from offered technology but saw themselves as the architects and engineers of the Internet. They saw their role as developing technology, and doing so quickly and efficiently.

As the IETF matured it became more like many other technology standards bodies, but there have been moments when the spark of the early days has returned. The IETF's reaction to the Snowden leaks was to regard the surveillance actions of national security agencies as a form of attack on its protocols, and the response was to add strong encryption to those protocols wherever possible. The results have been rapid and profound. The web now uses encryption as the de facto standard. The IETF has produced standards that encrypt mail, messaging, and even time synchronization. It has been thorough in taking even the last vestiges of traceable information and defining ways to encrypt it.
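
As a simple illustration of how pervasive this has become, here is a minimal Python sketch (my example, not from the original piece; example.com is just a placeholder host, and outbound network access is assumed) that opens a connection to a public web server and reports the TLS version it negotiates by default:

  import socket
  import ssl

  # Establish a TCP connection, then negotiate TLS on top of it,
  # just as any modern browser does for an https:// URL.
  ctx = ssl.create_default_context()
  with socket.create_connection(("example.com", 443)) as tcp:
      with ctx.wrap_socket(tcp, server_hostname="example.com") as tls:
          print(tls.version())   # e.g. 'TLSv1.3'
          print(tls.cipher())    # the negotiated cipher suite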

But at the same time, the IETF has not been able to provide a technology solution to every perceived issue and problem relating to abuse of the Internet's technology. If spam was a technology battle, then the IETF has lost it. No matter what the latest solution has been over the past two decades, the spammers have been able to work around it. The disturbing Denial of Service attack space is another illustration of how these technologies can be turned around and used as attack vehicles. There is a pervasive feeling of vulnerability and a sense that technology-based solutions are not offering the needed reassurance. These are hard problems, and it is completely unfair to suggest that the IETF is responsible for them, or to believe that the IETF should have had a solution to each and every one. However, the IETF was adamant in the past in saying to others, "leave us alone, we know what we are doing." That was, as it has turned out, a bit of a stretch! Our protocols are not resilient enough, and we are now seeing players break away to create their own protocols without necessarily seeking the IETF's permission.

What about ICANN and the Domain Name System?

It's challenging to summarise the issues in a couple of short paragraphs, but one could trace many of the namespace issues back to the original concepts in the early days of the Internet, which adopted a hierarchical namespace with a deliberately small set of top-level names. Country codes were added to this name set, but at the same time these "stateless" names continued. So-called "Name Envy" quickly followed, as others wanted to reap the benefits of these generic top-level domain names, and the pressure to add more such names to the DNS has continued ever since.

However, there is a contrary view that, whatever the Internet may need today, more generic top-level domain names are not on the list of actions that would help it. As we see more of these top-level domain names added to the root zone, we see more domain names that have little intrinsic value. There is a widespread perception that these new generic top-level domains represent havens of online abuse and malfeasance. ICANN, as the policy forum that has carriage of the stewardship of the namespace, appears largely impotent to stop this behavior, and incapable of stopping itself from allowing applicants to pay it large sums of money to generate even more such unnecessary top-level domain names.

Where does this end? We know that we cannot support a DNS infrastructure with billions of names in the root zone, but precisely how many names we could support before the DNS starts to fall apart is unknown. Rather than simply stopping this process of adding more top-level domains and working within the parameters of what we have, ICANN appears to be set on an inexorable path to expand the root zone of the DNS, and to do so without any endpoint in sight. There is no sense of constraint in its activities, and one wonders if the only point of constraint is the point at which we've managed to completely destroy the utility of the DNS and its name system.

What about the Regional Internet Registries and the Internet's address infrastructure?

The run-down of the address pools associated with IPv4 was a surprise to many. It's hard to see this as a fault in the RIRs' carriage of the administration of the address space, but it can be seen as part of a larger systemic failure. The IETF had identified the forthcoming exhaustion of the IPv4 address space as an issue almost thirty years ago, and to avoid this scenario it designed a protocol with a vastly larger address space to accommodate future growth. Following this IETF lead, the Internet industry was meant to behave prudently and transition to IPv6 long before IPv4 ran out. Obviously, this has not happened, and we are still largely using IPv4 long after the run-down of the available IPv4 address pools.

But while it is challenging to make the case that this represented a fault in the RIRs' function, the IETF does not escape some level of criticism here. In defining IPv6, the IETF ignored one of the primary drivers of the success of a network, commonly described as "connectivity is its own reward," and produced a protocol that was incompatible with its predecessor, thereby defining an entirely new network. A system whose growth is proportional to its size is, by definition, growing exponentially, and after years of such growth in the IPv4 Internet, the new network faced an impossible chase. The RIRs, as stewards of the distribution of number resources, have no capacity to coerce the adoption of this new protocol in any way that is sufficient to ensure its immediate deployment. Instead, the entire transition is a protracted waiting game with no obvious end in sight.
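
To make the growth argument concrete, here is a minimal sketch (my illustration, not part of the original argument) of why an incompatible newcomer faces an impossible chase: if both networks grow in proportion to their current size, the incumbent's head start is never eroded.

  % Growth proportional to size is exponential growth:
  \frac{dN}{dt} = kN \;\Rightarrow\; N(t) = N(0)\,e^{kt}
  % For an incumbent network I and an incompatible newcomer C
  % growing at the same relative rate k, the ratio never changes:
  \frac{N_I(t)}{N_C(t)} = \frac{N_I(0)\,e^{kt}}{N_C(0)\,e^{kt}} = \frac{N_I(0)}{N_C(0)}

The newcomer closes the gap only if its relative growth rate exceeds the incumbent's, and incompatibility works directly against that, since early adopters of the new protocol gain little of the "connectivity reward."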

Is Internet Governance Irrelevant?

I'd like to think it's not. I'd like to think that the principles of an open and accessible technology foundation are an intrinsic component of open, accessible and healthy societies. I'd like to think that the principles of accessibility, transparency, and diversity that are part of the IGF's mode of operation are valuable and should ensure a healthy and robust debate on the various topics of Internet governance. I believe that the IGF has assisted the increasing level of shared understanding of the Internet, in both its strengths and its weaknesses. I suspect that Internet Governance will be irrelevant only if we let it become so. Like any open cooperative effort, it requires continual care and attention if it is to continue to flourish.

But there is another side to an answer to this question. We are embarking on changes in our society that are as dramatic, and even as traumatic, as the industrial revolution. Such revolutions leave a path of social dislocation and uncertainty in their wake, and this information revolution is no exception. It is perhaps unsurprising that nation-states tend to be more assertive in such situations as they try to mitigate some of the worst excesses of such social disruptions. One side-effect of this increasingly nationalistic stance is that the various international efforts, both regional and global, tend to be regarded with increasing levels of distrust by these national regimes. In times of uncertainty and stress, nations naturally raise the drawbridge and attempt to insulate themselves from such disruptions by asserting greater levels of control within their own national realm.

The industrial revolution was certainly triggered by the refinement of the steam engine, but the social revolution was far larger in scope than the invention of a simple mechanical device. In a similar line of thought, maybe it's not the Internet or its governance that lies at the heart of many of today's issues. Maybe it's the broader issues of our enthusiastic adoption of computing and communications that form a propulsive force for change in today's world.

Written by Geoff Huston, Author & Chief Scientist at APNIC


More under: ICANN, Internet Governance, Policy & Regulation

Categories: News and Updates

.DE domains lead the way in $350k Sedo auction

Domain Name Wire - Fri, 2018-10-26 15:07

German country code domains sizzle in monthly auction.

Sedo’s October Great Domains auction ended yesterday with a higher-than-usual tally of $350,000. It was the highest-grossing Great Domains auction of 2018.

The top sale was wg.de at €102,501. WG is short for a flat/apartment share or dorm in German. There were 35 bids and the domain easily eclipsed its €65,000 reserve price.

NFC.com sold for €68,200, higher than most three-letter domain sales to investors. NFC is short for near-field communication, the technology used for contactless payments such as Apple Pay.

NOVO.com brought home a top bid of $52,999. Novo is a very popular name for companies. Just Google it.

Other top names include grillen.de (Barbecue) for €41,990, wc.de (Water Closet) for €29,000 and yal.com for $20,500.


Related posts:
  1. Next Navigation Sells Pathology.com and More
  2. Moms.com sells for $252,000 in Sedo Auction
  3. Two domain name auctions to watch this week
Categories: News and Updates

Law Enforcement Agencies Will Have Authority Over Registries and Registrars

Domain industry news - Fri, 2018-10-26 14:22

This one applies to European law enforcement agencies only, no matter what the GDPR says.

Accessing Whois information and acting on a disputed domain name is becoming a nightmare for law enforcement agencies. These agencies must have access to the information provided by registrants in the Whois database and, in specific cases, the authority to act FAST on a domain name. The EU has a solution for this, and it's coming in 2020. Is this mechanism welcome now that the GDPR is causing problems for law enforcement agencies trying to do their job? I'd say… yes it is.

What it is

It is Regulation (EU) 2017/2394 of the European Parliament and of the Council of 12 December 2017 on cooperation between national authorities responsible for the enforcement of consumer protection laws, repealing Regulation (EC) No 2006/2004. It is directly applicable in all Member States.

What it says

A few provisions extracted from the regulation:

  1. Competent authorities should be able to request any relevant information from any public authority, body or agency within their Member State, or from any natural person or legal person, including, for example, payment service providers, internet service providers, telecommunication operators, domain registries and registrars, and hosting service providers, for the purpose of establishing whether an infringement covered by this Regulation has occurred or is occurring.
  2. Where no other effective means are available to bring about the cessation or the prohibition of the infringement covered by this Regulation and in order to avoid the risk of serious harm to the collective interests of consumers: where appropriate, the power to order domain registries or registrars to delete a fully qualified domain name and to allow the competent authority concerned to register it.

Coming in 2020

This Regulation shall enter into force on the twentieth day following that of its publication in the Official Journal of the European Union. It shall apply from 17 January 2020.

The regulation is available here.

Written by Jean Guillon, New gTLDs "only".


More under: Domain Names, Law, Policy & Regulation, Registry Services, Whois

Categories: News and Updates

Verisign adds nearly 2 million .com/.net domains in Q3

Domain Name Wire - Fri, 2018-10-26 13:32

.Com continues to chug along.

Verisign (NASDAQ: VRSN) reported third-quarter 2018 results yesterday.

On the domain name side, the company processed 9.5 million new .com/.net registrations during Q3. After accounting for deletions, 1.99 million .com/.net domains were added to the domain base. At 151.7 million, the base is up 4% year-over-year.

The results exceeded Verisign’s guidance of 1.3 million to 1.8 million for the quarter.
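
As a quick back-of-the-envelope check of those figures (a sketch of mine; the deletion count is implied rather than reported):

  # Reported Q3 2018 figures, in millions of .com/.net domains.
  new_registrations = 9.5
  net_adds = 1.99
  base_now = 151.7
  guidance_high = 1.8

  # Deletions are implied: new registrations minus net adds.
  implied_deletions = new_registrations - net_adds    # about 7.5M

  # A 4% year-over-year rise puts last year's base near 145.9M.
  base_year_ago = base_now / 1.04

  print(f"implied deletions: {implied_deletions:.2f}M")
  print(f"base a year ago:   {base_year_ago:.1f}M")
  print(f"beat guidance:     {net_adds > guidance_high}")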

The company expects to add 0.9 to 1.4 million .com/.net domains to the base during Q4 of this year.

After this quarter, the company will change how it provides guidance on net adds. It will switch from quarterly guidance to annual guidance.

As I’ve noted before, it’s fairly easy to pull levers to hit this guidance mark. Switching to annual guidance will allow the company more time to decide which levers to pull for better long-term performance.

With regard to the Cooperative Agreement between Verisign and the NTIA, Verisign CEO James Bidzos said the company is confident that an amended agreement can be executed before it expires at the end of next month.


Related posts:
  1. .Com contract extension shorter than Verisign expected
  2. How Verisign hit its Q1 growth numbers
  3. Verisign sells security business to Neustar for up to $120 million
Categories: News and Updates
