Domain industry news

Latest posts on CircleID

France to Stop Using Google as Part of Its Plan to Establish Digital Sovereignty


The 2013 NSA revelations by the American whistleblower Edward Snowden were a stern wake-up call for French politicians. France and the European Union share a growing concern about becoming "digital colonies." Clothilde Goujard, in a report published in Wired, says: "France is working hard to avoid becoming a digital colony of the US or China. Last month, both the French National Assembly and the French Army Ministry declared that their digital devices would stop using Google as their default search engines. Instead, they will use Qwant, a French and German search engine that prides itself for not tracking its users. ... These days, hearing French politicians taking a bellicose stance on technology is becoming increasingly frequent."

Follow CircleID on Twitter

More under: Internet Governance, Policy & Regulation, Privacy

Categories: News and Updates

Google Building 600 Million Euro Data Center in Denmark Powered by Renewable Energy

Tue, 2018-11-20 19:45

Google will invest 600 million euro (roughly $700 million USD) to construct its first Danish data center. The company says the site will match its energy use with 100 percent carbon-free energy. Google further notes that the data center will be among the most energy-efficient in Denmark to date, taking advantage of advanced machine learning to ensure every watt of electricity counts. This will be Google's fifth data center in Europe, with others located in Ireland, Finland, the Netherlands and Belgium. "The Nordic countries, which can generate electricity relatively cheaply from renewable sources such as hydropower and wind, have long been a magnet for heavy power-using industries, but are now attracting power-hungry data centers," Reuters notes.


More under: Data Center


DOHA and ZIPPO Make Forty Five

Tue, 2018-11-20 17:06

Forty-five what? Forty-five abandoned top-level domains. On November 7, ICANN received a notice from the Communications Regulatory Authority of the State of Qatar that it is terminating the registration agreement for .DOHA. Two weeks before that, the Zadco company terminated .ZIPPO.

In addition to the $180,000 application fee, applicants had to hire consultants, make arrangements with back-end operators, and go through the certification process to get their TLD online. I'd say $500,000 is a reasonable estimate of the cost for each TLD. That means 45 abandoned TLDs represent over $20 million spent to accomplish nothing.

The list of terminated domains keeps growing, if anything at an increasing rate. There were only 9 terminations in all of 2017, but this year there were three in January, one in February, one in May, four in June, one in July, five in September, two in October, and one so far in November. Terminations come from a wide range of organizations, from McDonald's restaurants to Allstate insurance, Spiegel publishing, the Qatar communications ministry, the German post office, and even Bond University in Australia. (What were they thinking?)

While I can't feel very sorry for the big, presumably sophisticated organizations that wasted their money, it does make me wonder how much other destroyed value is hiding in new TLDs. The zone files for over 500 active new TLDs contain fewer than 10 names each, even though each registry is paying the ICANN fixed fee of $25,000/yr. That's $12.5 million of ICANN's budget from zombie TLDs. No doubt a few of those still have plans to do something, but even so, we're looking at a lot of failures.
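The arithmetic behind these figures is easy to check. A quick sketch, using the ~$500,000-per-TLD estimate above (the author's estimate, not an audited figure):

```python
# Rough sunk cost of the 45 abandoned TLDs
abandoned_tlds = 45
est_cost_per_tld = 500_000        # application fee plus consultants, back-end, certification
print(abandoned_tlds * est_cost_per_tld)   # 22500000 -- "over $20 million"

# Annual ICANN fixed fees paid by the ~500 near-empty ("zombie") TLDs
zombie_tlds = 500
icann_fixed_fee = 25_000          # USD per TLD per year
print(zombie_tlds * icann_fixed_fee)       # 12500000 -- the $12.5 million figure
```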

Written by John Levine, Author, Consultant & Speaker


More under: Domain Names, ICANN, New TLDs


EU Should Not Be Setting US WHOIS and Privacy Policy, Says MPAA

Tue, 2018-11-20 15:48

The Motion Picture Association of America (MPAA), in its recent submission to the National Telecommunications and Information Administration (NTIA), has raised a stern objection to ICANN's approach to complying with the EU's General Data Protection Regulation (GDPR), stating that the temporary specification goes "well beyond what the GDPR mandates." From the submission: "The European Union should not be setting U.S. WHOIS and privacy policy. Moreover, with growing concerns over illicit behavior on the internet, now is the time to increase online transparency, accountability, and trust — not diminish it. The MPAA, therefore, asks that the federal privacy approach prioritize efforts to ensure that certain basic WHOIS information remains publicly available and that any information that the GDPR does require to be removed from public access still be available to third parties with legitimate interests through a reasonable, timely, and effective process. Such efforts should include Administration support for solutions through the multistakeholder process, for ICANN assumption of WHOIS under a unified access model, for WHOIS access requirements in trade agreements, and for federal legislation requiring registrars and registry operators to continue providing lawful access."

But ICANN cannot be faulted for being overly conservative in avoiding violations of the GDPR, which can lead to fines of up to four percent of an entity's annual global revenue, says Alan Behr, a partner at Phillips Nizer. (via IPPRo)


More under: DNS, Domain Names, ICANN, Law, Policy & Regulation, Privacy, Whois


Stop Using the Term "Open Internet"

Sat, 2018-11-17 19:12

Over the past few years, the term "open internet" has become popular among politicians in Washington and Europe. It is bandied about in political pronouncements that assert that everyone needs to somehow support the open internet without ever actually defining it. It is sometimes used as a synonym for Net Neutrality.

In fact, it is a bogus public relations term that is rather like saying you believe in the Tooth Fairy. Furthermore, it vectors the focus away from more serious needs such as effective cybersecurity defense, and from the all-too-evident threats of adversaries who are using the open attributes of some internets to mount attacks on facilities and data repositories. It is time to stop using the term.

Open

The term "open" in the context of communication networks invokes potentially hundreds of different parameters. The first significant use of "open" occurred forty years ago with the emergence of the massive Open Systems Interconnection (OSI) initiative that governments and industry mounted in the ITU, ISO, and numerous other venues. The work led to massive numbers of standards, including, in the U.S., the Government Open Systems Interconnection Profile (GOSIP) specifications that profiled open networking products. Indeed, an open and relatively secure OSI internet was rolled out using the internet protocol CLNP. Ironically, it was the TCP/IP-based DARPA internet, designed for closed and carefully monitored R&D networks, that Clinton and Gore pushed into the public infrastructure in the early 1990s without any real security, and it is that internet which has recently been politically advanced for openness.

Furthermore, the basic current construct of openness is fundamentally nonsensical, as a hypothetical conversation illustrates: "You have a smartphone, lots of computer devices, and probably a home network. Are you willing to allow anyone and everyone in the world unfettered access and usage? No?" Well, you get the idea. It led the Cato Institute to dub this the "What's Yours is Mine" philosophy.

Therein lies the conundrum and absurdity of advocating unfettered openness of networks and devices. Advocating "openness" is equivalent to suggesting that all computer and network resources belong to everyone in the world for the taking and exploitation. No rational person, organization, or nation is likely to buy into that proposition.

Internet

Next, there is the oft-bandied term "internet." There is no singularity that exists as "the internet." Many different internets have existed since Louis Pouzin developed the concept in France nearly fifty years ago, and many will continue to exist. Indeed, after years of legal wrangling over IPR ownership of the term INTERNET, the U.S. Patent and Trademark Office in its landmark year 2000 order legally recognized that INTERNET is a meaningless term and free for anyone to use for anything they choose.

Throughout the world, there are countless networks that at multiple levels are enabled to process and route digital packets among devices within their architectures. Many have varied gateways with other networks. Innumerable internets and related services coexist as virtual overlays among them. Increasingly in the rapidly emerging NFV-SDN-5G world, these will be network slices orchestrated from data centres that are gatewayed as needed using the most efficient protocols and endpoint addresses.

Even if one focuses on the most politically popular of the internets, the one based on IPv4, the only real measurement of its topology, by CAIDA, is fuzzy to say the least, comprising roughly 50 million routers, 150 thousand links, and 50 thousand autonomous systems. As CAIDA notes, it is also highly U.S.-centric.

By comparison, the GSM global mobile internet currently has nearly 9 billion connected devices and 5 billion users and growing at an exceedingly fast rate — offering substantial openness at higher security levels.

Open Internet

An obvious consequential question is how the term "open internet" originated and why it persists. The phrase is generally associated with the Obama Administration's Net Neutrality initiative — which itself was largely initiated by Over the Top (OTT) providers that rely on the DARPA internet's U.S. centricity to pursue offshore markets — especially for mining available information and directly reaching end users. The term's use was almost unknown prior to 2008, although it was the political successor to the Clinton Administration's Internet Freedom strategy.

In Washington lobbying circles, the term has been co-opted by almost everyone as a kind of political mantra without ever explaining details — even after the reversal of FCC Net Neutrality policy. Oddly, the U.S. State Department is still promoting the term abroad, even as the Trump Administration's Net Neutrality policy has changed domestically. However, incongruity is a staple of life in Washington, and no one expects intellectual or policy consistency.

In the European Union, the term Open Internet is tightly bound to Net Neutrality, and has special significance in efforts to bring about a common market. Elsewhere in the world, it is not apparent that anyone really cares, as the global mobile internet infrastructure is more important.

What has changed?

Over the past several years, the exponential increase in the placement of malware and the exfiltration of sensitive information via open networks should have re-vectored the Open Internet rhetoric. The DARPA Director who approved its R&D internet development in the 1970s sounded the alarm in a continuing series of initiatives and papers to senior U.S. DOD officials beginning in the late 1990s. At the political level, Evgeny Morozov began raising concerns in 2011 with his book, "The Net Delusion: The Dark Side of Internet Freedom." The WikiLeaks Assange activism about the same time should also have been a wake-up call as to the ease and harm of massive data exfiltrations. Even the theft of the U.S. government's OPM clearance data failed to lessen the fervor for open internets.

However, it appears that a series of recent events in the U.S. have finally begun to heighten concerns about the dark side of internet openness. This began with the revelation in 2016 that Putin had hoist the U.S. with its own open-internet petard by actively intervening in the U.S. elections and the U.K. Brexit vote. The confirmation supplied by the Mueller indictments of FSB and GRU officers underscored the clear and present danger to the most critically important, existential governance of the nation. Subsequent events have amplified the concern, with the emergence of neo-Nazi social media sites contributing to the mass murder at a Pittsburgh synagogue, and with Facebook being co-opted as a service in active political influence campaigns.

So today, the Open Internet mantra is a hard sell — especially to foreign countries that likely have no interest in suffering the same experiences as the U.S. The mantra should be discarded and the focus shifted to providing something of considerable current value: effective cyber defense for all communication networks. In many cases, that requires internets that provide effective cyber defense at network gateways, and considerably greater attention to threat vectors like the so-called Pervasive Encryption protocols that exacerbate data exfiltration and malware placement. In many cases involving critical infrastructure, it means ensuring totally closed internets.

If a communication freedom mantra is needed, political leaders should return to the proven legacy norms such as "reachability" and "universality."

Written by Anthony Rutkowski, Principal, Netmagic Associates LLC


More under: Internet Governance, Policy & Regulation


An In-Depth Interview of OneWeb CEO Greg Wyler

Fri, 2018-11-16 19:00

Greg Wyler, Founder and Executive Chairman, OneWeb

OneWeb is building a large constellation of low-Earth orbit (LEO) Internet-service satellites, and Via Satellite has published the "definitive 2018" interview of OneWeb CEO Greg Wyler. The following are some of the quotes that caught my eye:

  • The system has been designed. The satellites have been tested. They are going through the final stages of testing now before the launches begin. The satellites have actually performed better than expected in many ways, especially with their Radio Frequency (RF) performance which is really positive.
  • I think we will have customers up and running in 2020.
  • Whether [our satellites] are $500 thousand (the estimate in 2015) or $1 million is virtually irrelevant because what they are not is $50 million, and that is where it started.
  • The initial customers will be in the mobility and emergency services markets (paraphrase).
  • Aviation is a big [market] for us.
  • Why not let Sprint, DT, roam onto the plane? You can give the customers 4G/5G on the same devices they are used to using in their car, at the gate, or in other places.
  • The plane itself can become a Local-Area Network (LAN) party! I have been in aviation my whole life so this is always something I have been interested in.
  • OneWeb with its first constellation will be able to make a big impact on health centers and schools.
  • I would like to keep [the number of satellites up in five years time] below 1,500.

The tone of the interview was positive, but the early emphasis on emergency and mobile services (where they will have competition from other, relatively focused LEO satellite companies like Telesat and Leosat) makes me wonder whether their goal of eliminating the digital divide by 2027 might be slipping.

If I could have asked one question, it would have been about the objection to OneWeb that has been raised by the Russian Federal Security Service (FSB). If the FSB succeeds in stopping OneWeb in Russia, OneWeb will lose access to a potential market. Furthermore, it would jeopardize the company's contract for 21 launches with the Russian space agency Roscosmos and perhaps add cost and delay to the project.

This has been a quick summary of a long interview — you should check out the full interview.

Written by Larry Press, Professor of Information Systems at California State University


More under: Broadband, Telecom, Wireless


SpaceX Wins FCC Approval to Deploy 7,518 Satellites for Broadband Communications

Thu, 2018-11-15 19:27

Space Exploration Technologies Corp. has been granted permission from U.S. regulators to deploy over 7,000 satellites. The company has two test satellites already in space and was earlier granted separate permission for 4,425 satellites. All the satellites are designed to provide broadband communications. Todd Shields, reporting in Bloomberg today, writes: "Space companies riding innovations that include smaller and cheaper satellites — with some just 4 inches long and weighing only 3 pounds — are planning fleets that will fly fast and low, offering communications now commonly handled by larger, more expensive satellites. Right now there are fewer than 2,000 operating satellites… The agency [FCC] on a 4-0 vote advanced rules to require more calculations to demonstrate a planned spacecraft poses a minimal risk of collisions, and to minimize new orbiting debris — for instance, from devices that remain aloft after releasing a satellite."


More under: Access Providers, Broadband, Policy & Regulation, Wireless


Neglected Domain Renewals Increasingly Scooped Up by Crooks for Credit Card Stealing Purposes

Tue, 2018-11-13 23:09

Registrants of domain names with decent traffic who fail to renew them are finding the lapse quite costly, both for themselves and for others. Brian Krebs reports: "Lately, neglected domains have been getting scooped up by crooks who use them to set up fake e-commerce sites that steal credit card details from unwary shoppers. ... Credit card data stolen by these various Magecart groups invariably gets put up for sale at online cybercrime shops… In addition, some Magecart actors will sell access to hacked online stores, allowing crooks who buy this access to receive a live feed of freshly-stolen payment card details for as long as the site remains compromised."


More under: Cybercrime, Domain Management, Domain Names


US, Russia and China Stay Out of Paris International Cybersecurity Pact

Tue, 2018-11-13 22:37

The US, China and Russia have refused to sign the French-backed agreement, the Paris Call for Trust and Security in Cyberspace, announced by the French President at the UNESCO Internet Governance Forum (IGF) on Monday. Olivia Beavers, reporting in The Hill: "Despite the Trump administration withholding its support, U.S. companies like Microsoft, Facebook, and Google, as well as over 100 other companies, joined 51 countries in signing onto the Paris Call. The agreement includes cyber principles that aim to limit offensive and defensive cyber weapons, including protecting civilians from cyberattacks, curbing hate speech, and deterring election interference by other foreign nations."


More under: Cyberattack, Cybersecurity, Internet Governance, Policy & Regulation


Nigerian Cable Company Takes Blame for Misrouting Google's Global Traffic Through China

Tue, 2018-11-13 21:15

Much of Google's traffic yesterday appeared to be re-routed through Russia and dropped at China Telecom. The issue raised serious concerns about a possible traffic-hijacking incident but was later linked to a network misconfiguration by a firm in Nigeria. Reuters reports: "Nigeria's Main One Cable Co took responsibility on Tuesday for a glitch that temporarily caused some Google global traffic to be misrouted through China, saying it accidentally caused the problem during a network upgrade. ... Main One said in an email that it had caused a 74-minute glitch by misconfiguring a border gateway protocol filter used to route traffic across the internet. That resulted in some Google traffic being sent through Main One partner China Telecom, the West African firm said."

"This incident further underscores one of the fundamental weaknesses in the fabric of the Internet," says ThousandEyes' Ameet Naik. ThousandEyes, a network monitoring firm, was one of the first companies to raise the alarm on Tuesday after noticing traffic to Google was getting dropped at China Telecom. Naik writes: "BGP was designed to be a chain of trust between well-meaning ISPs and universities that blindly believe the information they receive. It hasn't evolved to reflect the complex commercial and geopolitical relationships that exist between ISPs and nations today. ... Even corporations like Google with massive resources at their disposal are not immune from this sort of BGP leak or malicious hijacks. MainOne took 74 minutes to either notice or be notified of the issue and fix it, and it took about three-quarters of an hour more for services to come back up. Most enterprises who don't have Google's reach and resources may not be able to resolve the issue as quickly, which can significantly impact business."
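The kind of export-filter mistake Main One describes can be sketched as a toy model. This is a deliberate simplification using RFC 5737 documentation prefixes, not Main One's actual configuration (real BGP export policy lives in router configuration, not Python): a filter that should permit only the provider's own prefixes instead permits everything, so routes learned from one peer leak to another.

```python
import ipaddress

# Prefixes the provider itself originates (hypothetical, RFC 5737 test range)
OWN_PREFIXES = [ipaddress.ip_network("198.51.100.0/24")]

# Prefixes learned from peers -- here a stand-in for a Google route
LEARNED = [ipaddress.ip_network("203.0.113.0/24")]

def exported(routes, export_filter):
    """Routes actually announced to an upstream, after applying the filter."""
    return [r for r in routes if export_filter(r)]

# Correct policy: announce only prefixes we originate
strict = lambda r: any(r.subnet_of(p) for p in OWN_PREFIXES)

# Misconfigured policy: announce the whole table, leaking learned routes
permissive = lambda r: True

table = OWN_PREFIXES + LEARNED
leak = exported(table, permissive)   # learned routes escape upstream
ok = exported(table, strict)         # only our own prefixes go out

print(LEARNED[0] in leak)   # True  -- the leak
print(LEARNED[0] in ok)     # False -- what the filter should have done
```

With the permissive filter, the upstream (China Telecom in the incident) receives and propagates routes it should never have seen, pulling traffic for those prefixes toward the leaking network.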


More under: Networks


Has President Macron Thrown Multistakeholderism Under the Bus at UN IGF 2018 Paris?

Tue, 2018-11-13 17:14

Today, President Macron threw down the gauntlet to President Trump and the US administration on Multistakeholderism.

In his welcome address to IGF 2018 Paris a few hours ago, President Macron challenged the IGF to become more relevant by reinventing itself, factoring multilateralism into the IGF's non-decision-making structure, and moving beyond the mere talk-shop lip service it has offered for the last 13 years.

Macron stated that he (and France) will support a new direction by IGF under the direction of the UN Secretary General, if it decides to factor in multilateralism. He argued that with the new global cyber threat landscape requiring regulation and legislation, such a step is inevitable if IGF is to remain relevant.

Was Macron reminding his "ami" in the White House that his nationalistic, divisive stance towards the world and Europe, and the actions of previous administrations, have not been forgotten, and that they carry a very high price?

Today was the second time US domination of the Internet through Multistakeholderism has been challenged by Europe. Only a few months ago, the first challenge came in the form of the EU's GDPR, which became law on May 25, 2018. The GDPR has been a serious challenge to ICANN and its core role and mission, on many fronts.

What effect will the Macron speech have on the new religion of Multistakeholderism? It remains to be seen how this doctrine, created by the US back during the UN WSIS days (2001-2005), will fare.

Multistakeholderism's goal, for the record, has always been to promote internet self-regulation and to keep barriers to the internet around the world very low, all under the banner of "openness" and "freedom of speech." Governments that legislated (or even considered) blocking or monitoring content deemed inappropriate, including porn or worse, were often criticized and labelled as oppressors of freedom and democracy. This happened while American tech giants continued to grow at an unprecedented pace and dominate the Internet. Just recently the world saw its first two trillion-dollar companies ever, both of them American: Amazon and Apple. Meanwhile, no European company is in the top 10 world ranking.

Was Macron paying the US back for PRISM and the Snowden revelations that exposed to the world the unprecedented and illegal mass surveillance by the US security agencies? Not to mention similar abuses by UK's GCHQ, in snooping on Americans and the world under the banner of fighting terrorism?

Macron did state in his speech that France and Germany (next year's IGF host) are on the same page. So was he stepping up to support the German leader Merkel, herself one of the targets of PRISM when the U.S. snooped on friends, allies and enemies alike?

It will be interesting to start reading and observing the usual devout voices of Multistakeholderism when they begin calling on IGF to resist Macron's sacrilegious call and to keep IGF a talk-shop that cannot conclude or make decisions.

Yes, we do live in interesting and unprecedented times. It is entertaining to watch smart politicians fire at each other with different ammunition, at different targets, and on different battlegrounds, all while continuing to sound collegial, friendly and cooperative.

Seems to me like Macron just hit back at Trump on his disparagement of the NATO Budget, the Iran Nuclear deal, and the Paris climate change accord. Now, this is making IGF more interesting.

Let's see if Trump is smart enough to realize by himself that his 'ami' Macron just fired a shot across his bow that ruffled his hair.

Written by Khaled Fattal, Group Chairman, Multilingual Internet Group & Producer, "Era of the Unprecedented"


More under: Cybersecurity, Internet Governance, Law, Policy & Regulation, Privacy


Verisign's Attempt to Increase its Fees Still Unjustified Despite Diversionary Tactic

Tue, 2018-11-13 17:02

Shortly after the National Telecommunications and Information Administration (NTIA)'s recent announcement allowing Verisign to pursue increased .com registry fees, Verisign published a blog post questioning the business practices of registrars and domain name investors. The ICA, on behalf of its registrar and domain name investor members, had previously spoken out against a .com fee increase, as did others in the domain industry. Rather than justify a fee hike, Verisign attempted to shift the community's attention elsewhere. Yet the issue remains that the fee cap currently in place was put there for good reason: because there was no justification for any increase and because the public needed protection from excessive fees. Higher fees for Verisign are not justified, as demonstrated by considering these four key points:

  • Verisign is a provider of technical registry services; it does not own the .com name space;
  • Unlike Verisign's fees, the prices set by registrars and domain name investors are held in check by competition;
  • Verisign is already well-paid for its services, as evidenced by its substantial profits;[1] and
  • ICANN need not approve fee hikes; on the contrary, ICANN ought to assert its right to set reasonable fee levels for its hired registry manager.

Verisign's Role

1. Verisign is a Manager, Not the Owner of the .com Registry

Operating a database in a reliable and secure fashion is the primary job of a domain name registry, and by all accounts, Verisign has done that well. ICANN, the ultimate owner of the .com registry, contracted with Verisign to operate the registry as a service provider. ICANN did not relinquish ownership of the .com name space to Verisign.

The role of a domain name registry manager is comparable to that of a land registry manager. The registry manager, whether the registry is for land or for domain names, is entitled to a fee — nothing more. Verisign is not entitled to treat the .com name space as its corporate asset, any more than a land registry manager is entitled to treat the land that it enters into its records as its own. The wholesale price[2] of domain names accordingly is not a matter for the manager to determine, and the manager's fees should never be equated with the wholesale price, which is a determination that should be made exclusively by the owner of the name space.

In 2006, in response to a lawsuit brought by Verisign, ICANN granted Verisign the .com registry management contract with a presumptive right of renewal and without a competitive bidding process. This agreement was, and remains, quite controversial, as it is clear to many observers that it is extremely one-sided in Verisign's favor.[3] Yet while the ICANN Board granted Verisign what many considered a sweetheart deal to operate the .com registry,[4] Verisign's role continues to be that of a manager, not an owner. The .com name space, as one of the original legacy extensions, is therefore fundamentally different from a new gTLD which is created and paid for by a private company. When it comes to the .com registry, whose existence pre-dates Verisign's involvement, it is the ICANN community as the owner which is entitled to set fees, not its hired manager.

2. Verisign's Fees

Verisign is entitled to a reasonable fee for running the registry. Verisign currently charges $7.85 per .com domain name per year. The cost to operate the registry, after taking into account the expensive infrastructure necessary to provide reliable and secure operations and high overhead, has been estimated at $1.00[5] to $3.50[6] per domain name per year. Verisign enjoys an enviable and highly lucrative operating margin of 60%[7]. If the proposed fee increases go into effect, by the end of the six-year agreement term, the fee for each .com domain name will increase to $10.29 per year, a 30% jump from current levels.[8]
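The $10.29 figure follows from compounding the increases the amended agreement permits — up to 7% in each of the last four years of the six-year term, a detail not spelled out above. A quick check of the arithmetic:

```python
fee = 7.85
for year in range(4):            # up to 7% in each of the final four years of the term
    fee *= 1.07
print(round(fee, 2))             # 10.29
print(f"{fee / 7.85 - 1:.0%}")   # 31% -- the roughly 30% jump cited above
```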

Despite benefiting from the economies of scale of operating the largest domain name registry together with the .net registry, Verisign already charges far more for its registry services than other registry operators. For instance, as reported by Domain Name Wire, in a recent bid to manage the registry for India's .in domain name, Neustar offered to run it for just 70 cents per domain name per year, and registry operator Afilias offered $1.65 per domain name per year. Apparently, other qualified operators could manage the .com registry for far less than the $7.85 per domain name per year that Verisign charges, resulting over time in billions of dollars in cost savings to the registrants of the approximately 138 million registered .com domain names.[9]
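To see where the "billions" come from, one can hypothetically apply the lower .in bid prices to the .com installed base. This is only an illustration — the bids were for a different, far smaller registry — but the scale is telling:

```python
registrations = 138_000_000      # approximate .com base
verisign_fee = 7.85              # current per-name .com fee, USD/year
afilias_bid = 1.65               # Afilias's per-name .in bid, used here as a stand-in
annual_savings = registrations * (verisign_fee - afilias_bid)
print(f"${annual_savings:,.0f} per year")   # $855,600,000 per year
```

Over a six-year contract term, savings on that order would indeed run into the billions.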

The owners of other name spaces, in particular, the authorities in charge of certain national domain name (ccTLD) extensions, are not shackled by a one-sided agreement that gives one company the right to manage their registries in perpetuity. These authorities have more flexibility in their arrangements with their registry operators, including the right to open the registry management contract to bid and the right to replace their registry manager. When the .fr authority put the .fr registry contract out for bid it resulted in lower fees to run the registry,[10] while open bidding for the management of the .au registry resulted in a change of registry operator and expected cost savings.[11]

The ICANN community is now able to benefit from knowing the results of competitive bids to run other registries. Although Verisign has a presumptive right of renewal, ICANN has a say in setting the fees that Verisign charges. ICANN can and should set the fees for operating the .com registry at the market determined level for these services: profitable for the manager, but not exceedingly so. Accordingly, not only is hiking the fees for .com unjustified, lowering the fees appears warranted.

3. Verisign is Not Entitled to Full Credit for .com's Success

The .com extension was widely adopted as the de facto address for the commercial Internet long before Verisign was awarded the privilege of operating the .com registry. In the United States, the vast majority of the top commercial websites use .com domain names. The billions of dollars spent by companies promoting their .com websites have firmly entrenched .com in the minds of consumers as the pre-eminent and most sought-after Internet extension. Verisign ought not to reap outsized rewards disproportionate to its role as an effective and competent manager.

4. Registrants Would Have Little Choice but to Pay Higher Fees

If a business that is already established on its .com domain name is unhappy with the higher renewal fees, moving to a different domain name would be highly disruptive and expensive, if not entirely impractical. All of the prior advertising promoting a business' .com domain name, all the existing links, and all its email addresses, would need to be changed. All the goodwill that a company had developed around its .com brand would be lost. Verisign, therefore, benefits from having a substantially "captive audience" that is generally forced to absorb any and all fee increases which Verisign decides to impose. Accordingly, there should be no consideration of any increase in .com registry management fees in the absence of clear and compelling justification for raising them on a substantially captive market.

5. .Com Registry Fees Are Not Constrained by Competition

Because of the unique role .com plays in the Internet Economy, the availability of other extensions is not sufficient to constrain .com registry fees. The .com extension enjoys a dominant position as the brand extension for much of the commercial Internet. Most of the world's best-known companies and most heavily-visited websites operate on a .com domain name. This has created an expectation in the minds of consumers, especially in the United States, that a business website will be found on a .com domain name.[12] Globally, .com has also become the preferred extension for companies operating internationally, as unlike country code extensions, it is not tied to any one particular country. Indeed, as Verisign's own marketing proclaims, .com is "revered as the global online standard".[13]

In part because .com offers unique branding value, it does not face effective competition. If it were the case, as NTIA appears to claim, that lifting the fee freeze is justified by the development of a "more dynamic DNS marketplace",[14] one would reasonably expect that such genuine competition would compel Verisign to lower fees, not raise them.[15] Yet even the availability of hundreds of other new domain name extensions, often offered at far lower registration and renewal fees than .com, has had little impact on the continuing and burgeoning demand for .com domain name registrations.[16]

It is therefore misguided to believe that market forces will naturally constrain fees to register or renew .com domain names. The .com extension offers unique benefits, and those desiring those benefits can only obtain them by selecting a .com domain name.[17] Furthermore, the large installed base of .com websites has little choice but to pay whatever fee is charged to renew their domain names, regardless of whether that fee has any bearing on the actual costs incurred by Verisign in running the .com registry. If Verisign is wrongly permitted to act as the de facto owner of the .com name space, it will be able to seize for itself the windfall profits that come from controlling a monopoly — the largest, most lucrative and most dominant name space that has evolved on the Internet.

Domain Investors and Registrars Operate in a Competitive Marketplace

In its aforementioned blog post, Verisign made some unfortunate suggestions that domain name investors and registrars who participate in the domain name aftermarket may be considered "scalpers", and that domain investors' businesses are "questionable". The domain name aftermarket, to the contrary, is entirely lawful, highly competitive, and any profits earned are the result of successful investment in a free and open marketplace.

1. Investment Lawfully Exists in Every Marketplace

Whether in land, a catalogue of Beatles songs,[18] or domain names, investing in assets is a natural by-product of a free and open market. Domain registrants use and risk their own money to lawfully purchase generic and descriptive domain names on a first-come, first-served basis and from prior owners, and they have every right to continue to do so. Domain name investors range from an at-home mom making a casual investment in a handful of names, to a top branding agency that offers for sale thousands of domain names registered as a by-product of brainstorming new name candidates for clients,[19] to professional domain name investors who spend substantial money and effort building a portfolio and marketing it to the public. Domain names are also sold by companies large and small that originally registered them for future development, defensively, or because holding a valuable .com domain name is helpful for their business. Such business activities involving domain names are entirely legal, expected, and natural. There is nothing "questionable" about them. As one commentator recently put it, a domain name investor is engaging in no more "questionable" a business activity than a Verisign shareholder who purchases stock in the hope of reselling it for more than they paid, at whatever price the market will bear.

Domain name investors risk their own capital to register or purchase a domain name with no guarantee that they will ever see a return on that investment. They compete with thousands of other market participants around the globe, seeking out desirable domains and bidding against each other at auctions where the price is set by the market through the combined actions of thousands of participants. Many domain name investors lose money on their acquisitions, finding that they have overpaid for domain names that others do not regard as attractive investments or simply do not want.

Investing in valuable generic and descriptive domain names is comparable to investing in vacant real estate. Both investments are made on the basis of an expectation that there will be an appreciation in value upon resale. A businessperson who wishes to open a storefront on 5th Avenue in New York City would expect that land to be already owned. Similarly, it should not come as any surprise that a valuable domain name already has a registered owner, whether it be a professional domain name investor or another kind of business, and that the owner is prepared to sell it at a market-determined price.

2. Professional Domain Investors Control an Estimated 10% of .com Domains

The best estimates are that the holdings of professional domain name investors represent approximately 10% of all registered .com domain names. The remaining 90% or so are held by individuals, small businesses, and large corporations. Those registrants may never interact with the domain investment community and may never purchase an aftermarket domain name, but 100% of .com domain name registrants may soon be subject to higher registration and renewal fees from Verisign. As such, if Verisign were permitted to raise the fees for .com domain names, most of the burden would fall on the vast majority of .com registrants who are not domain name investors. It is not enough to excuse a fee increase by saying that registrants are only being charged "a dollar or two more," when that dollar or two, multiplied across the entire class of .com registrants, adds up to billions of dollars in excessive fees.
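The "adds up to billions" claim can be checked with back-of-the-envelope arithmetic. The registration base, per-domain increase, and time horizon below are illustrative assumptions for a rough sketch, not figures from this article (which cites only "over 100 million" .com registrations):

```python
# Back-of-the-envelope check of "a dollar or two more ... adds up to billions".
# All inputs are assumed, round numbers chosen for illustration only.

registrations = 140_000_000   # assumed size of the .com registration base
increase = 1.50               # assumed per-domain annual fee increase, USD
years = 10                    # assumed horizon over which the increase persists

per_year = registrations * increase
total = per_year * years

print(f"extra fees per year: ${per_year / 1e6:.0f} million")  # $210 million
print(f"over {years} years: ${total / 1e9:.1f} billion")      # $2.1 billion
```

Even a modest $1.50 increase, sustained over a decade across a registration base of this size, lands in the low billions of dollars, which is the order of magnitude the argument relies on.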

3. Domain Name Investors Offer a Valuable Service by Providing Liquidity to an Illiquid Market

Domain names are notoriously illiquid investments; the holding period of domain names held by investors can stretch into decades. Yet when an individual or a company wishes to sell a domain name immediately, it is the investor who steps up to provide a ready market and liquidity. If, for instance, a retiring couple who used a valuable generic domain name for their business wish to sell it now that it is no longer needed, but have trouble finding an interested end-user buyer, domain name investors will often step in, bid against each other for the right to acquire the domain name, and thereby create a liquid market enabling the couple to quickly convert their domain name into cash. When Yahoo! wished to sell one of its domain names, it was put up for auction at a domain investor conference where the winning bidder paid $380,000.[20] Domain name investors allow domain name owners to readily obtain cash for domain names that they no longer need or otherwise wish to sell.

4. Domain Name Pricing and Availability Would be Little Different Even in the Imagined Absence of Domain Name Investors

Domain name investors do not set the market value of aftermarket domain names nor do they determine which domain names are desirable — the operation of a competitive marketplace does. Prices and desirability are dictated by the market. If the asking price is set too high, a buyer can choose from a variety of similar domain names available at a range of prices.[21] If a domain name is desirable, it would have been registered long ago even in the absence of domain name investment.

Verisign proposes an unrealistic scenario in which, in the imagined absence of domain name investment, valuable domain names (one reportedly purchased for $3,500,000, another for $1,200,000)[22] could be obtained at a standard registration cost of around $10 each. Even if professional domain name investors vanished, high-quality domain names would not be sitting unregistered, and the owners of these domain names would seek their market value.

For instance, Procter & Gamble was one of the companies in the early days of the Internet with the foresight to register valuable generic dot-com domains of the kind favored by domain investors. When it went to sell some of those domain names, P&G was surprised by their high value on the secondary market. P&G did not offer those domains for sale at its cost, nor did anyone expect it to.[23] Similarly, now that electrical engineer Marcelo Siero has decided to sell a domain name he registered 24 years ago, he is seeking a market price in the millions of dollars,[24] and who would fault him for that?

Twenty years into the evolution of the commercial Internet, and after over 100 million .com domain name registrations, there are almost no domain names of general value sitting unregistered. The net impact on .com domain name availability due to the presence of domain name investors is that domain name investors have registered millions of lower-quality, less desirable domain names that otherwise might have gone unregistered.[25]

5. Verisign Encourages Domain Investing and Has Benefited Greatly From It

Verisign, throughout its history, has encouraged people to invest in domain names and has been a primary sponsor at conferences focused on domain name investing. Verisign had good reason to do so, as domain name investors send tens of millions of dollars to Verisign annually from registering millions of domains that no one else has shown much interest in. The business most benefiting from these registrations is Verisign itself, which receives $7.85 per domain name per year while taking no risk, whereas the domain name investor often loses money on his or her investment.[26]

6. Verisign Sells Domains at Premium Prices

During the "land rush" for the Japanese[27] and Korean[28] versions of its .com domain names, Verisign unilaterally set initial prices on certain keyword domains at over $10,000 each.[29] For instance, Verisign set a premium price on the Japanese-script domain ブログ.コム, which is currently available for first-time registration at a price of $15,000 from a retail registrar.[30] Our understanding is that while a portion of this price represents the registrar's retail mark-up, the base price set by Verisign was over $10,000. Verisign's criticism of speculators for offering premium domain names at market prices, when Verisign engages in comparable sales, therefore appears deeply contradictory.


NTIA apparently made a regrettable political calculation that the profits of a single U.S. corporation outweigh the interests of millions of .com registrants across the globe, including millions of Americans. The ICANN Board need not follow the lead of the Trump Administration's NTIA on this issue. Indeed, it would be a dereliction of the ICANN Board's responsibilities to the ICANN community to do so. Verisign was hired to do a job: run the .com registry. The ICANN Board can allow Verisign to effectively determine how much to pay itself for providing this service, or the Board can fulfill its responsibilities as the owner of the registry by protecting the global community of registrants from unjustified fee increases.


[2] The wholesale price is the price that registrars pay.


[4] See for example,; and also see


[6] See

[7] See;

[12] "buying an exact .com domain makes finding your website much easier for customers and lends invaluable prestige to your company." (

[15] For another perspective on the dubious rationale for lifting the fee freeze see:

[16] See:

[17] "The problem with not having the .com of your name is that it signals weakness… a marginal domain suggests you're a marginal company.", see:

[18] "McCartney told Jackson about how he had been purchasing other artists' catalogues (such as Buddy Holly's) as a business investment.", see:

[19] See


[21] "Finding the right URL can be daunting, but with a little time and effort you can absolutely find a domain that works for your brand — and your wallet!", see

[22] Source

[23] See


[25] "While buying up a ton of domains seems like a great way to make some extra money, the real world results show that it is very hard to make that process profitable."; see:

Written by Zak Muscovitch, General Counsel, Internet Commerce Association

Follow CircleID on Twitter

More under: Domain Names, ICANN, Policy & Regulation, Registry Services, New TLDs

Categories: News and Updates

A Less Than Candid PP-2018 OTT Resolution

Sun, 2018-11-11 20:46

As the ITU 2018 Plenipotentiary Conference rolls toward a close this week, its most controversial and contentious subject appears baked into a new treaty-instrument resolution that has apparently reached a kind of steady state. After distilling the many input proposals through ten revisions and a corrigendum, the tasked drafting committee has produced a new resolution with the simple title of "OTTs."

Bear in mind that Over-the-Top (OTT) encompasses any and all communication networks and services — including those using ephemerally encrypted tunnels — that can be instantly and massively instantiated across the world. OTTs can be — and have been — used to plant malware, mount massive network attacks, create phony accounts to change election outcomes, establish extremist groups that massacre religious worshipers, surreptitiously exfiltrate user data, and remotely control IoT devices. The list of significant OTT threats to peoples, societies, and nations is lengthy.

It is a preposterous understatement to suggest there are merely international "considerations" and "challenges." Yet that is about as close as the OTT Resolution comes to declaring (in the Apollo 13 phrase) "Houston, we have a problem here."

One can only imagine the dialogue in Dubai where a national delegate tries to sell the benefits of OTTs to another country, and they are reminded that the USA elections were hacked from an adversary nation via OTTs. It is worthy of a Saturday Night Live skit.

Plainly, OTTs have potentially profound adverse impacts on national sovereignty, legal systems, network services compliance obligations, network attacks, human rights including privacy, cybersecurity, wholesale data theft, fraud and consumer protection, cybercrime, national security, and protection of cyber-physical infrastructures. Yet none of these "red team" adverse consequences are found in the OTT Resolution.

Still, in the end, the new OTT Resolution seems to be on the way to adoption, and it does call for extensive continuing "collaboration and dialogue." What the resolution doesn't get quite right, however, is that the transition occurring is not from "legacy to an IP-based ecosystem," but rather to a NFV-SDN/5G ecosystem of network slices. The reality today is that it is the ancient and vulnerable IP platform that is "legacy."

Only China seems to have taken the long view in its OTT-related proposal: solving OTT challenges in a manner that avoids complete Balkanization will require a set of new International Telecommunication Regulations that establish transnational OTT "rules of the road". Huffy-puffy nationalism doesn't work well when selling OTT services to other countries. In the meantime, the topic of extraterritorial OTT instantiation is sure to dominate the work of many bodies and people. Some will pursue their interests in expanding and exploiting global markets as cheaply and as quickly as possible. Others will be more judicious and concerned about the potentially profound adverse effects on their nations and users. It will be a bonanza for public international law scholars.

So, as the Conference comes to a close this week, the final tweaks to the OTT Resolution, as well as the Reservations and Counter-Reservations of the signatory Nation States, will make interesting reading!

Written by Anthony Rutkowski, Principal, Netmagic Associates LLC

Follow CircleID on Twitter

More under: Internet Governance

Categories: News and Updates

Schneier: Lasting IoT Security Will Only Happen if Governments Start Introducing Stiff Penalties

Sun, 2018-11-11 19:57

Without regulation, there is little hope companies will implement proper security protection measures for IoT devices, said author and security expert Bruce Schneier, during a panel discussion at the Aspen Cyber Summit. Schneier via a report by Shaun Nichols in The Register: "Looking at every other industry, we don't get security unless it is done by the government. I challenge you to find an industry in the last 100 years that has improved security without being told [to do so] by the government. ... I don't think people are going to say I'm going to choose my refrigerator based on the number of unwanted features that are in the device."

Follow CircleID on Twitter

More under: Cybersecurity, Internet of Things, Policy & Regulation

Categories: News and Updates

Protecting Privacy Differently

Sat, 2018-11-10 19:40

My thesis is simple: the way we protect privacy today is broken and cannot be fixed without a radical change in direction.

My full argument is long; I submitted it to the NTIA's request for comments on privacy. Here's a short summary.

For almost 50 years, privacy protection has been based on the Fair Information Practice Principles (FIPPs). There are several provisions, including purpose specification and transparency, but fundamentally, the underlying principle is notice and consent: users must be informed about collection and consent to it. This is true for both the strong GDPR model and the much weaker US model: ultimately, users have to understand what's being collected and how the data will be used, and agree to it. Unfortunately, the concept no longer works (if indeed it ever did). Arthur Miller (no, not the playwright) put it this way:

A final note on access and dissemination. Excessive reliance should not be placed on what too often is viewed as a universal solvent — the concept of consent. How much attention is the average citizen going to pay to a governmental form requesting consent to record or transmit information? It is extremely unlikely that the full ramifications of the consent will be spelled out in the form; if they were, the document probably would be so complex that the average citizen would find it incomprehensible. Moreover, in many cases, the consent will be coerced, not necessarily by threatening a heavy fine or imprisonment, but more subtly by requiring consent as a prerequisite to application for a federal job, contract, or subsidy.

The problem today is worse. Privacy policies are vague and ambiguous; besides, no one reads them. And given all of the embedded content on web pages, no one knows which policies to read.

What should replace notice and consent? It isn't clear. One possibility is use controls: users specify what their information can be used for, rather than who can collect it. But use controls pose their own problems. They may be too complex to use, there are continuity issues, and — at least in the US — there may be legal issues standing in their way.

I suspect that what we need is a fundamentally new paradigm. While we're at it, we should also work on a better definition of privacy harms. People in the privacy community take for granted that too much information collection is bad, but it is often hard to explain to others just what the issue is. It often seems to boil down to "creepiness factor".

These are difficult research questions. Until we have something better, we should adopt use controls; until we can deploy those, we need regulatory changes in how embedded content is handled. In the US, we should also clarify the FTC's authority to act against privacy violators.

None of this is easy. But our "data shadow" is growing longer every day; we need to act quickly.

Written by Steven Bellovin, Professor of Computer Science at Columbia University

Follow CircleID on Twitter

More under: Internet Governance, Policy & Regulation, Privacy

Categories: News and Updates

IGF 13 & Paris Peace Forum: Europe Should Take Lead in Shaping a "New Deal" on Internet Governance

Fri, 2018-11-09 20:12

Co-authored by Wolfgang Kleinwächter [1], Matthias C. Kettemann [2] and Max Senges [3].

The development of the Internet has arrived at a new crossroads. Growing Internet Governance complexity is also producing greater confusion about how the digital future should be shaped. French President Emmanuel Macron and UN Secretary-General António Guterres will open the Paris Peace Forum and the 13th IGF, where Internet Governance is a key issue. Is the time ripe for a "New Deal" on Internet Governance? And which stakeholder should bear primary responsibility for the normative framing of the key challenges Internet Governance is facing?

Recently, APNIC's Geoff Huston asked here whether Internet Governance had become irrelevant. We give the same answer as he does: No. But we identify a new player in the Internet Governance matrix, one with a long history of rule of law promotion that has been in recent years less active than it should be: Europe. As a political actor, its legitimacy as a force for ensuring the reign of rule of law in international relations is unparalleled. It also packs an economic punch. In data protection and the fight against cybercrime, European norms have been successfully globalized. We argue that the time has come for Europe to become a leader in a process towards a "New Deal" on Internet Governance.

Why Internet Governance is in crisis

Why — and why now? Simply put, the actors, normative instruments and processes of internet governance are in crisis. There is a growing gap between discussions and decisions. Governments show a growing inability and unwillingness to compromise on global Internet-related public policy issues. And a rhetorical battle over dominance and control has become increasingly gladiatorial, with walls erected and bridges burned.

In such an environment, Europe should take up the charge of moving internet governance forward in the light of its liberal, human rights-based values, prior legal commitments, normative pedigree in areas such as data protection and the fight against cybercrime, and powerful economic potential — and it should do so in the best form and forum possible, through coordinated multi-level activities in an invigorated IGF.

As the internet governance community seems evenly divided between two argumentative camps, one centered on a dogmatic interpretation of norms and the other arguing for a more technology-oriented reading of international law, we are faced with four dimensions of dysfunctionality plaguing current internet governance approaches. These are generalizations, but they hold true in essence:

  1. Internet governance actors do not substantially cooperate in all areas of governance, but increasingly pursue narrow self-interests. (Europe can provide a credible alternative by showing how global commons-oriented internet governance policy is better suited to develop an order in which rights and goods are fairly distributed and political authority is checked.)
  2. Relatively new normative processes, such as the UNGGE, break down and existing ones, such as the IGF, seem to stall (which is why the IGF needs to be reinvigorated).
  3. Subsequent generations of normative instruments ('principles' and 'norms') fail to convincingly alter the behavior of actors in light of cybersecurity threats. Those that convince normatively, such as the norm to protect the internet's public core, are too narrow to substantially influence internet governance policies at a macro level. (This is why Europe's normative pedigree is an important source of legitimacy for norms.)
  4. There are alarming tendencies to 're-silo' internet policies, that is, to treat trade as unconnected to, say, human rights or cybersecurity through a sectoral approach. (This is why a holistic approach is needed — and Europe can provide it.)

For years the global Internet Governance discussion was overshadowed by debates between multilateralists and multistakeholderists. Today, however, the conflict dynamics have changed in tone and direction. The new threat to the globality of the internet is a new unilateralism coupled with populist illiberalism and neo-nationalism. The recent "Freedom on the Net 2018" report called "digital authoritarianism" the biggest attack on a free, open, secure and unfragmented Internet. The failure of the UN Group of Governmental Experts (UNGGE) in the field of cybersecurity in June 2017, as well as the inability of the World Trade Organization (WTO) to draft a universal framework for global digital trade in December 2017, indicated that the road back to constructive multilateral negotiations on Internet-related public policy issues will be a long one.

But even if this road is a long one, it has to be based on the existing and re-confirmed commitment to international law and human rights by all UN member states and on recognition of the multistakeholder approach. The 2013 UNGGE report re-confirmed the relevance of the UN Charter for cyberspace. In 2015, the UN Human Rights Council re-confirmed the relevance of the UN Universal Declaration of Human Rights and stated that individuals should have the same human rights online as offline. In 2017, the G20 re-confirmed the relevance of the multistakeholder approach to Internet Governance.

However, what we have seen since then are rather different interpretations of those instruments and concepts: different governments and different stakeholders have different ideas about how to implement those commitments in cyberspace when it comes to concrete political decisions on cybersecurity, digital trade, freedom of expression or privacy. Some governments put national interest first; others recognize the global nature of the Internet and the need to find common solutions.

This new controversy between "neo-nationalism" and "globalism" now pollutes all debates. It can be seen in discussions on how to stabilize the internet's core architecture; how to respond to new cybersecurity challenges such as backdoors in Internet technology and lethal autonomous weapon systems; how to keep the internet open and safeguard online freedom; how to promote sustainable digital development, manage digital trade and the future of work; and how to frame the development of the Internet of Things and Artificial Intelligence.

All this is on the agenda of numerous discussion and negotiation platforms, such as the IGF, ITU, UNESCO, ILO, UNCTAD, ICANN, IETF, W3C, G7, G20, BRICS, WTO, OECD, Council of Europe, ASEAN, African Union, the UN Secretary-General's High-Level Panel on Digital Cooperation (HLP.DC) and many other state and non-state bodies.

The multiplication of negotiation platforms mirrors the growing Internet Governance complexity, but this complexity has also led to a new Internet Governance confusion. Very often, the right hand has no clue what the left hand is doing in the digital world. Even within governments, different ministries have different ideas about how to deal with the world's one Internet.

To prevent this confusion from escalating into controversies that spin out of control and end in a digital catastrophe, a new deal on how to balance conflicting interests, and how to channel the controversial discussion into a process toward cybersecurity and sustainable digital development while respecting human rights, is more than needed. Such a "New Deal" must include all stakeholders and be based, as outlined above, on the existing international legal system, but it must also introduce some political innovations.

Europe's window of opportunity

Europe, which was rather silent in the Internet Governance discussion during the 2010s, could make a substantial contribution to such a "New Deal". Europe's strength is the rule of law. European institutions — from the Council of Europe, with the European Court of Human Rights in Strasbourg, to the institutions of the European Union, with the European Parliament, the European Commission and, in Luxembourg, the European Court of Justice — have produced instruments and offer procedures which make clear that cyberspace is ruled by law.

Historically, data protection law was a European concept that successfully migrated internationally. A number of cases (see Schrems or Google Spain) have given Europe a judicial track record of holding companies to account. In terms of legislation, the recent GDPR is respected and recognized as an important example of how to weigh privacy, security and innovation. Europe should build on its role as a normative actor: a legitimate norm-setter whose norms exert a genuine compliance pull.

And even if Europe is not a leader in the global digital economy in areas like search engines and eTrade, and is not the headquarters of hackers and crackers, with half a billion consumers Europe has strong market power. On its road into the digital future, Europe's manufacturing industry has a lot of potential. Europe has a highly developed educational system which is able to produce the skill sets needed for tomorrow's digital economy. Europe is now trying to leapfrog into the digital platform economy by pushing Industry 4.0, the Internet of Things (IoT) and Artificial Intelligence (AI). These issues need innovative, enabling governance, and Europe can take up this baton, too. The rule of law is not a barrier to innovation; it stimulates creativity and is an enabler of sustainable development, economic growth and the emergence of the jobs of the future.

Europe must step up with a more pronounced and vigorous internet governance agenda. That it can do so, is unquestionable. That it will do so, is not a foregone conclusion.

We call on Europe

  1. To recommit to the overarching goal of internet governance to contribute to securing world peace and international security (as enshrined in the UN Charter), ensuring human development (as codified in the Millennium Development Goals and the Sustainable Development Goals) and respecting, protecting and implementing human rights (as normatively ordered in the UDHR); and
  2. To recommit to internet governance as a policy priority and to the achievements of multistakeholder governance as defined in the NetMundial Principles. This seems especially relevant at a time when Brexit negotiations and the rise of populism and authoritarianism pose serious challenges to the internal stability of the EU; it's time to look outward and reestablish Europe's role as a normative actor.
  3. In pursuing a vigorous normative approach to internet governance, Europe should not fall for technical determinism but premise everything on the controlling power of normativity over technicity. Rather than letting a technical medium define our societal values, it is the values embedded in the normative order of the internet that should define the evolution of the internet's underlying technologies through normative framing and regulatory interventions.
  4. The lesson can stick if Europe leverages past normative successes into stabilizing its influence as a governance actor. Europe's power lies in its position as an exporter of norms and of legitimate, norm-based internet governance initiatives that include checks and balances and hold all actors accountable.
  5. Sharpening Europe's normative edge will only work if Europe engages all stakeholders. As a non-traditional actor itself, the EU has a long history of engaging non-state actors in legislative processes.
  6. All of this will only work if Europe commits to a "New Deal on Internet Governance". States, citizens and companies (in Europe and beyond) need to understand why cybersecurity, the digital economy and human rights matter and why they will be essential for the well-being of the world in the next decade of the 21st century.

A New Deal on Internet Governance

The best approach to ensuring cybersecurity, promoting the digital economy and safeguarding digital rights is not pursuing a new treaty. A number of governments believe that new legally binding international instruments to protect national segments of the Internet are the "silver bullet" that will make the Internet safe. But control, censorship, and surveillance are not the answers to the new challenges of the digital world. Europe's approach has to be different.

Firmly grounded in human rights and international law, Europe must rather build support around the idea that the internet presents different iterations and scaling of existing problems. The challenge is to find the right balance among the various legitimate political, economic, cultural and social interests of states and non-state actors. The existing legal order is broad and flexible enough to deal with the new challenges. Adjustments could and should be made, where needed.

As said above, there is no need for an "Internet Treaty" similar to the "UN Law of the Sea Convention" or the "Paris Climate Pact". But there is a need for a new approach based on mutual trust and a political will to keep the Internet open, free, safe and unfragmented, in the interest of both its four billion users of today and the next billion users of tomorrow. Such a 'New Internet Governance Deal' has to be based on international law, multistakeholderism and accepted internet governance principles.

Within this 'New Deal' Europe's normative approaches can be put into four separate but interlinked baskets:

  1. Push for a "Digital Peace Plan" which includes norms for good behaviour of state and non-state actors in cyberspace and confidence-building measures (CBMs) to counter adventurous policies which risk leading to digital disasters;
  2. Introduce and deploy a "Digital Marshall Plan" to promote the UN Sustainable Development Goals, to bring the next billion Internet users online and to create a framework for a free and fair trade of data and market-driven innovation;
  3. Develop a "Framework of Interpretation on how to protect Human Rights in the Digital Age";
  4. Draft an "Ethical Guideline for the deployment of the Internet of Things and Artificial Intelligence".

a. Cybersecurity

Reducing the risks of a cyberwar is a common goal for internet governance endeavors, as is the introduction of confidence-building measures and norms for good state behavior in cyberspace. Good foundations already exist: the UNGGE Reports from 2013 and 2015, the OSCE and ASEAN recommendations on CBMs, proposals by the Global Commission on the Stability of Cyberspace (GCSC), and others. Further, Microsoft's proposals for a Digital Geneva Convention, the Tech Accord and the Digital Peace Campaign are now on the table. All this has provoked controversial discussions, and it remains to be seen how such ideas can be turned into a concrete political project. Elon Musk's proposal to ban killer robots goes in the same direction. Not only states, but also non-state actors such as big private corporations and civil society organisations, have no interest in being pulled into political power games which could lead to a cyberwar.

Europe must take these initiatives and commitments one step further. It has to push for a framework which secures the peaceful and sustainable development of the Internet in the 2020s. There is a need to protect the public core of the Internet: the whole world depends today on its functioning, and any attack against its basic functionality would lead to far-reaching disasters and should be seen as a crime against humanity. There is also a need for enhanced protection of electric power systems, transportation, healthcare systems and financial services, as well as electoral procedures.

There are a lot of common interests even among states diverging in their internet policies. What is missing at the moment is the political will to translate these common interests into arrangements which will benefit all sides. As a flexible and credible provider of diplomatic solutions over decades, Europe can fulfill an important role here. The norm package proposed by the Global Commission on the Stability of Cyberspace can be an important source of inspiration.

b. Digital Economy

A 'Digital Marshall Plan' should be adopted and deployed to increase the productive forces within global trade relations regarding the internet and to improve development opportunities for all. As an economic powerhouse, Europe is well suited to encourage trade. As a bloc of nations that has been discussing common trade policies for over five decades, the EU especially is ideally placed to ensure that the digital economy is on every political agenda.

The UN 2030 Agenda for Sustainable Development identified the building of resilient infrastructure, the promotion of inclusive and sustainable industrialization and the fostering of innovation as key goals of sustainable development. In Target 9.c of the Sustainable Development Goals (SDGs), states commit to "[s]ignificantly increase[ing] access to information and communications technology and striv[ing] to provide universal and affordable access to the internet in the least developed countries by 2020". There is thus a commitment by UN member states to strive for universal internet access by 2020, which is deeply connected to increases in digital trade. Even if this commitment is difficult to realize, its importance as evidence of states' opinion vis-à-vis the internet is hard to overstate. Committing to universal access means, by implication, that internet integrity as a precondition for meaningful access needs to be ensured and is therefore in the common interest. Europe can do a lot to bring the next billion Internet users online.

c. Human Rights in the Digital Age

As a region with a strong track record of human rights protection, Europe must make sure to orient all policies towards the human being. As the NetMundial Declaration from 2014 reaffirmed, Internet Governance has to be based on respect for human rights. There is no need to invent "new human rights." But there is a need to analyze the implications of new technological developments for existing human rights. This is relevant in particular for the right to freedom of expression and the right to privacy.

The UN Human Rights Council has appointed Special Rapporteurs for both of these rights, who function as watchdogs, producing critical reports for the UN General Assembly and making their own suggestions on how to strengthen the protection of human rights in cyberspace. Europe must continue its support for these initiatives.

There is space for some adjustments of existing international legal instruments, but there is no need for an "international law 2.0". The emergence of the internet and the pervasiveness of ICTs in today's societies have fundamentally changed or challenged neither the UN Charter (1945) nor the Universal Declaration of Human Rights (1948). Recall the WSIS documents referring to the importance of international law and human rights (2005 & 2015) or the commitments by the UNGGE (2013 & 2015).

But these commitments have not (yet) been stabilized by conventional norms. Normative preferences for a rule-of-law-based international internet governance model are undermined by destabilizing state actions, including cyberattacks, pervasive state surveillance via the internet, and attempts by states to create new barriers against the free and fair flow of information and data and to re-nationalize the global Internet.

Looking forward towards a reinvigorated IGF

Since its foundation in 2006, the UN-based Internet Governance Forum (IGF) has been the place where all these conflicts and controversies are discussed in free and open conversations. It is certainly true that the IGF has some weaknesses. The UNCSTD IGF Improvement Working Group has made recommendations which were reaffirmed by the UN General Assembly in its WSIS+10 Resolution in December 2015. The Multistakeholder Advisory Group (MAG) and the IGF Secretariat are working hard to implement those recommendations; however, the whole process is still underfinanced and understaffed. Progress is slow, but there is improvement, including more intersessional work, more tangible output and more interlinkage with national and regional initiatives.

Unfortunately, some governments merely pay lip service to the multistakeholder approach and prefer to negotiate Internet-related issues bilaterally behind closed doors. Also, many private Internet corporations have not yet discovered the IGF as a "place where one has to show up". In doing so, they rob the IGF of its full potential as a global clearinghouse of internet governance innovation.

Three subsequent IGFs taking place in Europe (Geneva 2017, Paris 2018 and Berlin 2019) have put Europe into the spotlight of the global Internet governance discussion. This gives Europe — its governments, but also its IT companies, non-governmental organizations, and the technical and academic community — a chance to show the advantages of its normative approach. EURODIG, the European IGF, has in the past innovated IGF processes with new ideas, including interactive session formats, tangible output in the form of clear and short messages, a Youth IGF, open calls for themes, and decentralized, bottom-up management procedures. All this can help push the IGF forward to the next level.

We have shown how Europe can contribute to a reinvigorated IGF and strengthen its position as a legitimate global leader in Internet governance. Together with other stakeholders in their respective roles, Europe should engage in a forward-looking process of establishing the contours of a 'New Deal for Internet Governance'.

This "New Deal" could take the form of a legally non-binding framework of commitments by state and non-state actors on how to stabilize and develop cyberspace for the benefit of all. Such a "New Deal" would go beyond normative precursors, such as the "Global Compact" proposed by the Bildt Commission (2016) and the "Magna Carta" recently proposed by Tim Berners-Lee at the Lisbon Web Summit. Such a "New Deal" would stimulate a process for the 2020s in which WSIS+20 (2025) and the review of the UN SDGs (2030) are already marked as milestones on the Internet Governance calendar.

In light of the challenges ahead, we end as we started: Internet Governance matters greatly. And Europe's time to act on this is now.

This contribution is based on Kettemann/Kleinwächter/Senges, The Time is Right for Europe to Take the Lead in Global Internet Governance, Normative Orders Working Paper 2/2018.

[1] Wolfgang Kleinwächter, Professor Emeritus at the University of Aarhus, is a member of the Global Commission on Stability in Cyberspace;
[2] Matthias C. Kettemann is a postdoctoral fellow at the Cluster of Excellence "Normative Orders", University of Frankfurt/Main;
[3] Max Senges is a Visiting Scholar at Stanford's Center on Democracy, Development, and the Rule of Law (CDDRL) and a Senior Program Manager at Google Germany.

Written by Wolfgang Kleinwächter, Professor Emeritus at the University of Aarhus

Follow CircleID on Twitter

More under: Cybersecurity, Internet Governance, Policy & Regulation

Categories: News and Updates

Cyber Security Word Salad

Thu, 2018-11-08 00:31

Two months ago, the Trump White House published its National Cyber Strategy. It was followed a few days ago by the release of the draft NSTAC Cybersecurity "moonshot."

The Strategy document was basically a highly nationalistic America-First exhortation that ironically bore a resemblance to China's more global two-year-old National Cybersecurity Strategy.

However, the Moonshot draft comes across as a public relations gambit meant to underpin the Strategy pronouncement by borrowing on the Von Braun project pitched to President Kennedy and implemented in the 1960s as the Apollo program. Apart from the rather ludicrous comparison, the draft itself serves up little more than another of the cybersecurity word salads found around Washington, with six "strategic pillars" sprinkled on top. We are told that these pillars achieve "a more enduringly safe and secure Internet within the next 10 years [that] will require a holistic and multi-disciplinary approach." A "word salad" rendition of the draft is attached as an image.

These kinds of documents have appeared everywhere around the world over the past decade. Perhaps not unexpectedly, they all tend to have the same salad ingredients: Technology, Human Behavior, Education, Ecosystem, Privacy, and Policy. NATO has an extensive library of them.

And almost every regional and global organization and intergovernmental body today has its own version. The EU has several, and nearly two hundred Nation States at the ITU Plenipotentiary are redrafting a bundle of them at the moment.

There is not much new in the NSTAC draft except the Moonshot packaging, plus the potential creation of a few new mini-bureaucracies among existing government agencies to oversee the effort and lobby for additional funding. The last point — funding — figures prominently in the recommendations even as the document plainly offers nothing substantively new.

The report places considerable faith in "U.S. Government leadership" when the historical record of creating joint efforts like SEMATECH and MCC has been problematic at best, in sectors far less abstruse. Furthermore, as opposed to the UK's NCSC, the aversion within the U.S. to supporting its most valuable expert Information Assurance assets at NSA creates an enduring institutional dysfunction. Additionally, scores of other national government agencies and thousands of companies and institutes scattered globally are pursuing similar well-funded initiatives that are largely unknown within the U.S. government, which has no ability to discover them and bring about convergence and harmonization.

What is most unfortunate is the model itself — which suggests there is some kind of achievable endpoint of cybersecurity. The complexities and dynamics of contemporary electronic components, code, and networks — coupled with business economics, adversarial incentives, legal constraints, and human foibles — result in an ecosystem where risk management and cyber-hygiene are the necessary courses of action.

On the positive side, the draft recommendations do hark back to a period when NSTAC hosted its own R&D expert community and regular R&D workshops. There are, however, several faux pas. While the draft repeatedly mentions that 5G is extremely important and that it will replace existing internets, its Glossary somewhat embarrassingly does not know where 5G work is actually done (i.e., 3GPP and the NFV ISG) or that 5G is already being rolled out. The lack of engagement by U.S. government agencies in existing 5G industry technical developments has long been endemic.

More significantly, the report continues to push the politically motivated "open internet" even though NSTAC was warned two decades ago by the DARPA Director who approved the TCP/IP platform development that the "open internet" notion was flawed and that meaningful cybersecurity is fundamentally impossible with open internets. Indeed, the dangers of open internets have come vividly home to roost over the past year, courtesy of Russia's FSB and GRU.

Fortunately, the legacy DARPA internets are rapidly transitioning to a world of virtually instantiated network slices under a 5G aegis. While considerable attention is being devoted to security in 3GPP and related venues, unknown and unforeseen vulnerabilities and attacks may still emerge.

Written by Anthony Rutkowski, Principal, Netmagic Associates LLC

Follow CircleID on Twitter

More under: Cybersecurity, Internet Governance, Policy & Regulation

Categories: News and Updates

BGP Hijacks: Two More Papers Consider the Problem

Tue, 2018-11-06 18:14

The security of the global Default Free Zone (DFZ) has been a topic of much debate and concern for the last twenty years (or more). Two recent papers have brought this issue to the surface once again — it is worth looking at what these two papers add to the mix of what is known, and what solutions might be available. The first of these —

Demchak, Chris, and Yuval Shavitt. 2018. "China's Maxim — Leave No Access Point Unexploited: The Hidden Story of China Telecom's BGP Hijacking." Military Cyber Affairs 3 (1).

— traces the impact of Chinese "state actor" effects on BGP routing in recent years. Whether these are actual attacks or mistakes arising from human error generally cannot be known, but the potential, at least, for serious damage to companies and institutions relying on the DFZ is hard to overestimate. This paper lays out the basic problem and then works through a number of BGP hijacks in recent years, showing how they misdirected traffic in ways that could have facilitated attacks, whether by mistake or intentionally. For instance, quoting from the paper:

  • Starting from February 2016 and for about 6 months, routes from Canada to Korean government sites were hijacked by China Telecom and routed through China.
  • In October 2016, traffic from several locations in the USA to a large Anglo-American bank headquarters in Milan, Italy was hijacked by China Telecom to China.
  • Traffic from Sweden and Norway to the Japanese network of a large American news organization was hijacked to China for about 6 weeks in April/May 2017.

What impact could such a traffic redirection have? If you can control the path of traffic while a TLS or SSL session is being set up, you can place your server in the middle as an observer. In many situations this can be avoided if DNSSEC-backed certificate pinning (DANE) is deployed to ensure the certificates used in setting up the TLS session are valid, but DNSSEC is not widely deployed, either. Another option is to simply gather encrypted traffic and either attempt to break the key or use data analytics to understand what the flow is doing (a side-channel attack).
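To make the DANE idea concrete: a TLSA record, protected by DNSSEC, pins the certificate a client should see, so a hijacker's substitute certificate fails the check even if the route has been redirected. Here is a minimal Python sketch of the matching step for the most common record type. The certificate bytes and record contents are invented for illustration; a real client would fetch the TLSA record through a validating resolver and implement RFC 6698 in full.

```python
import hashlib

def tlsa_matches(cert_der: bytes, usage: int, selector: int,
                 matching: int, tlsa_data: bytes) -> bool:
    """Check a server certificate against a DANE TLSA record (RFC 6698).

    Only the common case is sketched: usage 3 (DANE-EE, pin the end-entity
    cert), selector 0 (full certificate), matching type 1 (SHA-256 digest).
    """
    if (usage, selector, matching) != (3, 0, 1):
        raise NotImplementedError("only DANE-EE / full cert / SHA-256 sketched")
    return hashlib.sha256(cert_der).digest() == tlsa_data

# Toy bytes stand in for a real DER-encoded certificate.
legit_cert = b"-----fake certificate der bytes-----"
record = hashlib.sha256(legit_cert).digest()  # what the TLSA RR would carry

assert tlsa_matches(legit_cert, 3, 0, 1, record)        # pinned cert passes
assert not tlsa_matches(b"mitm cert", 3, 0, 1, record)  # substitute cert fails
```

The point of the sketch is that the trust anchor moves out of the traffic path: the comparison succeeds or fails regardless of which networks the TLS handshake traversed, which is exactly what a BGP-level observer cannot defeat without also compromising DNSSEC.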

What can be done about these kinds of problems? The "simplest" — and most naïve — answer is "let's just secure BGP." There are many, many problems with this solution. Some of them are highlighted in the second paper under review —

Bonaventure, Olivier. n.d. "A Survey among Network Operators on BGP Prefix Hijacking — Computer Communication Review." Accessed November 3, 2018.

— which illustrates the objections providers have to the many forms of BGP security that have been proposed to this point. The first is, of course, that it is expensive. The ROI of the systems proposed thus far is very low; the cost is high, and the benefit to the individual provider is rather low. There is both a race-to-perfection problem here and a tragedy-of-the-commons problem. The race-to-perfection problem is this: we will not design, nor push for the deployment of, any system which does not "solve the problem entirely." This has been the mantra behind BGPSEC, for instance. But not only is BGPSEC expensive — I would say to the point of being impossible to deploy — it is also not perfect.

The second problem in the ROI space is the tragedy of the commons. I cannot do much to prevent other people from misusing my routes. All I can really do is stop myself and my neighbors from misusing other people's routes. What incentive do I have to try to make the routing in my neighborhood better? The hope that everyone else will do the same. Thus, the only way to maintain the commons of the DFZ is for everyone to work together for the common good. This is difficult. Worse than herding cats.

A second point — not well understood in the security world — is this: a core point of DFZ routing is that when you hand your reachability information to someone else, you lose control over that reachability information. There have been a number of proposals to "solve" this problem, but it is a basic fact that if you cannot control the path traffic takes through your network, then you have no control over the profitability of your network. This tension can be seen in the results of the survey above. People want security, but they do not want to release the information needed to make security happen. Both realities are perfectly rational!

Part of the problem with the "more strict," and hence (considered) "more perfect" security mechanisms proposed is simply this: they are not quite enough. They expose far too much information. Even systems designed to prevent information leakage ultimately leak too much.

So… what do real solutions on the ground look like?

One option is for everyone to encrypt all traffic, all the time. This is a point of debate, however, as it also damages the ability of providers to optimize their networks. One point where the plumbing analogy for networking breaks down is this: all bits of water are the same; not all bits on the wire are the same.

Another option is to rely less on the DFZ. We already seem to be heading in this direction, if Geoff Huston and other researchers are right. Is this a good thing, or a bad one? It is hard to tell from this angle, but a lot of people think it is a bad thing.

Perhaps we should revisit some of the proposed BGP security solutions, reshaping some of them into something more realistic and deployable? Perhaps — but the community will need to let go of the "but it's not perfect" line of thinking and start developing some practical, deployable solutions that don't leak so much information.

Finally, there is a solution Leslie Daigle and I have been tilting at for a couple of years now: finding a way to build a set of open source tools that will allow any operator or provider to quickly and cheaply build an internal system to check the routing information available in their neighborhood on the 'net, and to mix local policy with that information to do some bare-bones work to make their neighborhood a little cleaner. This is a lot harder than "just build some software" for various reasons; the work is often difficult, and — as Leslie says — it is largely a matter of herding cats rather than inventing new things.
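As a sketch of what the core of such a bare-bones neighborhood-checking tool might do, the following Python fragment compares observed BGP announcements against a hypothetical local table of expected origin ASes, flagging both exact-prefix and more-specific (subnet) hijacks — the two most common forms of origin misdirection. All prefixes and AS numbers are illustrative placeholders drawn from documentation-reserved ranges, not real policy, and a real tool would feed this from a BGP data source and a maintained policy store.

```python
import ipaddress

# Hypothetical local policy: the prefixes we care about and the origin AS
# each one is expected to be announced from.
EXPECTED_ORIGINS = {
    ipaddress.ip_network("203.0.113.0/24"): 64500,
    ipaddress.ip_network("198.51.100.0/24"): 64501,
}

def check_announcement(prefix: str, origin_as: int):
    """Return an alert string if this announcement conflicts with local
    policy, or None if it is consistent (or outside our neighborhood)."""
    net = ipaddress.ip_network(prefix)
    for known, expected in EXPECTED_ORIGINS.items():
        if net == known and origin_as != expected:
            return f"ALERT: exact hijack of {known} by AS{origin_as}"
        if net != known and net.subnet_of(known) and origin_as != expected:
            return f"ALERT: more-specific hijack of {known} by AS{origin_as}"
    return None

assert check_announcement("203.0.113.0/24", 64500) is None      # legitimate
assert "exact" in check_announcement("203.0.113.0/24", 65551)   # wrong origin
assert "more-specific" in check_announcement("203.0.113.0/25", 65551)
```

Even something this simple captures the spirit of the proposal: each operator only validates its own neighborhood against its own policy, with no need for a global, perfect security mechanism.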

Written by Russ White, Network Architect at LinkedIn

Follow CircleID on Twitter

More under: Access Providers, Cybersecurity, Networks

Categories: News and Updates

China Telecom Accused of Misdirecting Internet Traffic

Mon, 2018-11-05 21:53

A paper published by the Naval War College titled, "China's Maxim – Leave No Access Point Unexploited: The Hidden Story of China Telecom's BGP Hijacking," accuses the Chinese government of manipulating BGP routing in order to intercept internet traffic. Doug Madory, Oracle's Director of Internet Analysis, who was involved with the 2017 activities to stop the effort, says there is some truth to the paper's assertion but does not claim to know the motivation behind the actions. He writes: "On 9 December 2015, SK Broadband (formerly Hanaro) experienced a brief routing leak lasting little more than a minute. During the incident, SK's ASN, AS9318, announced over 300 Verizon routes that were picked up by OpenDNS's BGPstream service ... Over the course of several months last year, I alerted Verizon and other Tier 1 carriers of the situation and, ultimately, Telia and GTT (the biggest carriers of these routes) put filters in place to ensure they would no longer accept Verizon routes from China Telecom."
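The kind of filter described above can be sketched in a few lines: reject any route learned from a given peer whose AS_PATH contains an ASN that should never be reached through that peer. This is an illustrative simplification of what carriers like Telia and GTT reportedly deployed, and the AS numbers below are drawn from the documentation-reserved range (64496-64511) rather than the real networks involved.

```python
# Hypothetical filter policy: for each peer AS, the set of ASNs that must
# never appear in an AS_PATH received from that peer.
FILTER_POLICY = {
    64496: {64499},
}

def accept_route(peer_as: int, as_path: list) -> bool:
    """Return False if the route's AS_PATH contains an ASN forbidden for
    this peer; True otherwise (including peers with no policy entry)."""
    forbidden = FILTER_POLICY.get(peer_as, set())
    return not forbidden.intersection(as_path)

assert accept_route(64496, [64496, 64510, 64511])      # clean path: accepted
assert not accept_route(64496, [64496, 64499, 64511])  # leaked route: dropped
assert accept_route(64497, [64497, 64499])             # no policy for peer
```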

Follow CircleID on Twitter

More under: Access Providers, Networks

Categories: News and Updates

Berners-Lee Launches Global Campaign to Save the Web From Destruction

Mon, 2018-11-05 20:37

Tim Berners-Lee has called on governments, companies and individuals to back a new "Contract for the Web" that aims to protect people's rights and freedoms on the internet. The global campaign aims to save the web from the destructive effects of abuse and discrimination, political manipulation, and other threats that plague the online world. Ian Sample reporting in The Guardian: "The contract outlines central principles that will be built into a full contract and published in May 2019, when half of the world's population will be able to get online. More than 50 organisations have already signed the contract, which is published by Berners-Lee's World Wide Web Foundation alongside a report that calls for urgent action."

Follow CircleID on Twitter

More under: Internet Governance, Policy & Regulation, Privacy, Web

Categories: News and Updates
