Domain industry news

"Objective" and "Objectivity" in UDRP Decision Making

Thu, 2020-09-24 22:38

No one will disagree that disputes before arbitral tribunals and courts should be determined on the merits. I have noticed that some Panels appointed under the Uniform Domain Name Dispute Resolution Policy (UDRP) have employed the words "objective" and "objectively" in their recent decisions. In pondering these linguistic choices, it seems to me that there are two possible reasons for their use; the first is more acceptable than the second. One involves explaining the policy underlying the UDRP; the other, assuring readers that the outcome is not based on subjective sympathies. Drawing inferences, of course, is a principal tool of reasoning, but it can be misused, never more so than in the absence of concrete facts. The thinner the record, the less justification for drawing inferences, and when such inferences are drawn from thin records, it can be suspiciously self-justifying to claim objectivity.

The first reason for the choice, which is perfectly respectable, is illustrated in Fundación Trinidad Alfonso Mocholí, Fundación de la Comunitat Valenciana v. Jack Zhang, D2020-1543 (WIPO August 4, 2020). In this case, the Panel explains that "[i]n the context of domainers, panels have generally assessed the issue of registration in bad faith objectively" (emphasis added). This is no more than saying that the assessment of the facts will not be filtered through the decision-maker's subjective bias. Objectivity, the Panel is assuring the public, is the model for UDRP decision making; it is what is to be strived for. Parties need to be reassured that decision-makers will not cross the line into subjectivity.

The same point is further illustrated in Putzmeister Engineering GmbH v. Domains By Proxy, LLC / David Adams, D2020-1454 (WIPO September 5, 2020) (<putzmeisters.com>). In Fundación Trinidad, the Panel is assessing the facts to determine the Complainant's case; in Putzmeister the Panel is assessing the Respondent's case. Having examined facts — which included proof that the domain name was virtually identical to the mark — it concluded that "[o]n any objective view, the Respondent is not a reseller with a legitimate interest in a domain name incorporating a manufacturer's mark, such that it could meet the tests set out in Oki Data Americas, Inc. v. ASD, Inc., [D2001-0903 (WIPO November 6, 2001)]. Nor, alternatively, is the Respondent commonly known by the Disputed Domain Name" (emphasis added). The Panel has a thick record, which makes the inferences reliable in the absence of rebuttal facts.

In Intesa Sanpaolo S.p.A. v. Stefano Santino, CAC 103218 (ADR.eu September 9, 2020) (<intesa-sanpaolo-sicurezza-okey.com>), the Panel underscored another perfectly suitable use of "objectively," this time by tying the conclusion to a precedential construction of the policy:

As regards to the first aspect, the Complainant has already extensively proved the renown[ ] of its trademarks. For what concern the second circumstance, it must be underlined that it is objectively not possible to understand what kind of use the Respondent could make with a domain name which does exactly correspond to the Complainant's trademarks and that results so similar to the Complainant's domain names currently used by the latter to provide online banking services for enterprises. (Emphasis added).

We expect Panels to exercise their power objectively and to avoid tilting on the basis of suspicion. We expect this because the Panel's power is of such immensity that it can divest a respondent of its property. Objectivity is not always achieved, however. Panelists have stepped into error, either because they apply the wrong law or because they believe they are being objective even when their determinations are questionably so. This point is illustrated in a number of ill-determined awards over the years, some of which have been vacated in challenges under the Anticybersquatting Consumer Protection Act and others of which have gone unchallenged because of the prohibitive cost of litigation. There is, I think, a temptation to find liability where the registration of a domain name is suspicious simply because it is identical to the mark, even though the term is not conclusively associated only with the complainant. But in many of these cases, Panels reached their conclusions on thin records and stepped into error.

An illustration of this temptation is Aveve N.V. v. Privacy Administrator, Anonymize, Inc. / Dennis Koorn, D2020-1115 (WIPO September 2, 2020) (<arvesta.com>). The essential fact in this narrative is that the Respondent was the winning bidder for the domain name in a public auction. It did not approach the Complainant. The Complainant had, in the recent past, rebranded itself as ARVESTA, invoking "Harvest" and other land-related connotations. The Complainant could equally have participated in the public auction, but it failed to do so. Nevertheless, the Complainant was "suspicious," not on the basis of any particular evidence, and like a virus that suspicion infected the majority, which, although there was no direct evidence to support the belief, chose to accept the allegation as though it were true.

The Panel repeated its belief not once but thrice — a case, I think, of the "[majority] protesteth too much" (a point made in the lengthy dissent):

The objective evidence, however, discloses that the Respondent appears to have undertaken considerable research into the disputed domain name before bidding for it, given the Respondent supports its claim to the disputed domain name by reference to the third party uses and Estonian and Finnish usage referred to above. It appears highly likely that those searches must have revealed the Complainant and its rights in the name "Arvesta", as shown by the Respondent's own evidence (see the Google searches as mentioned above).

The objective reasons leading to the Panel's conclusions that the Respondent does not have rights or legitimate interests in the disputed domain name lead also to the finding that the Respondent registered and has used the disputed domain name in bad faith.

On the objective evidence included in the record in this case, and the inferences arising from that evidence, the majority considers that standard has been satisfied in this case for the reasons set out above. (Emphasis added in all three quotations).

What the "objective evidence" is we never learn; it exists in the mind. What we learn from the actual facts pleaded is that the record is extremely thin; there are no concrete facts except that the Respondent did some research before offering the winning bid. In other words, the inferences drawn by the majority are based on suspicion and conjecture alone rather than evidence, even as the majority claims to have relied on "objective evidence."

Where facts justify, of course, inferences can be drawn. But where facts cannot be decisively assessed for want of proof, the complaint must be dismissed. A Panel's willingness to base a decision on a thin record increases the likelihood that it is made on subjective grounds, assertions of objectivity to the contrary notwithstanding, and that kind of decision-making undercuts expectations of neutrality. As on a continuum, the thinner the record, the less justification to deduce what, in effect, results in drawing something from nothing. Where suspicion becomes the fulcrum of liability, there is, purely and simply, error.

Written by Gerald M. Levine, Intellectual Property, Arbitrator/Mediator at Levine Samuel LLP

More under: Domain Management, Domain Names, Brand Protection, UDRP

Internet Bifurcation: Will the US-China Digital Arm-Twisting Splinter the Open and Free Internet?

Thu, 2020-09-24 20:18

The Internet controversy between the US and China is escalating. The Trump Administration is fighting against Huawei, TikTok and WeChat. China is pushing back with new export regulations for Chinese IT technology. On August 5, 2020, the US State Department launched a "Clean Network" initiative, aimed at removing Chinese digital corporations from the global supply chain in today's interconnected world.1 On September 8, 2020, the Chinese Foreign Ministry replied with a "Data Security" initiative, aimed at enhancing global cybersecurity in "Chinese colours."2 So far, the UN has been one of the main battlefields for US-Chinese Internet controversies. But now it gets much more concrete, and for the moment, the real theater of the digital arm-twisting between the two cyber superpowers is the continent of Europe.

In August 2020, US Secretary of State Mike Pompeo traveled to four European countries (Slovenia, the Czech Republic, Austria and Poland). He gave a strategic speech in Prague on August 12, 2020, where he attacked China and offered Europe membership in a new coalition of like-minded countries for a "clean network."3 At the end of August 2020, the Chinese Foreign Minister Wang Yi traveled to five other European countries (Italy, Norway, the Netherlands, France and Germany). He delivered his strategic speech in Paris on August 30, 2020, where he rejected the "Clean Network" approach and recognized Europe as "a major force in a multi-polar world."4

The two speeches from Prague and Paris signaled to the rest of the world that a strong cyberstorm is coming. The free, open, global and interoperable Internet, as we have known it since the 1990s, is moving towards an unprecedented stress test. An Internet Bifurcation is on the horizon.

Pompeo's Prague Speech

In Prague, Mike Pompeo attacked the Chinese Communist Party (CCP): "The CCP lies, and makes those who tell the truth disappear. The regime has a Marxist-Leninist core no less than the Soviet Union did, and indeed, perhaps more so. The party has always put itself first. Its actions flow from its ideology. And it's paranoid about free societies. What's happening now isn't Cold War 2.0. The challenge of resisting the CCP threat is, in some ways, much more difficult. That's because the CCP is already enmeshed in our economies, in our politics, in our societies in ways the Soviet Union never was."

He advertised the "Clean Network" initiative, which aims to remove Chinese companies from global supply chains in fields like apps, cloud, cable, stores and carriers. He called for an alliance "of countries and companies who refuse to sacrifice cybersecurity just to save a little bit of money" and argued that the growing threat requires comprehensive opposition to China on all fronts of the digital battlefield. "We have to explain to our citizens the price free societies will pay if we don't confront this threat. We have to explain what kind of scrutiny we must give to Chinese investment and why. And we have to talk to them about what sorts of alliances are needed to be built between the United States and Europe and around the world, and how we will retool to withstand and resist this threat." He reiterated what he said in his Press Statement on the "Clean Network" initiative: "The United States calls on our allies and partners in government and industry around the world to join the growing tide to secure our data from the CCP's surveillance state and other malign entities. Building a Clean fortress around our citizens' data will ensure all of our nations' security."

The "Clean Fortress" language sounds like the "Iron Curtain" language, the former British Prime Minister Winston Churchill used in his famous Fulton speech on March 5, 1946. Churchill's speech, a couple of months after the end of WWII, the drop of the nuclear Hiroshima bomb and the "Potsdam Agreement" from August 1945 (where Stalin, Roosevelt and Churchill disagreed about the future of Europe) is seen today as the start of the "Cold War" between the US and the Soviet Union.5 Will Pompeos "Clean Network Fortress" mark the beginning of building a "digital iron curtain" which will split the global Internet space into two cyberworlds?

Wang's Paris Speech

Wang Yi's speech in Paris avoided any confrontational tone, but he was also clear that the Chinese understanding of cyber sovereignty has to be the starting point for enhanced digital cooperation. Wang said in Paris: "China firmly opposes any schemes to create a new Cold War, and will not allow any force to deny the right of the Chinese people and people around the world to pursue development and a better life." Wang supported "multilateralism" and opposed "unilateral acts of bullying." He called for free digital trade and criticized strategies of decoupling: "With China deeply interconnected with the world, decoupling from China means decoupling from development opportunities and from the most dynamic market." Wang offered Europe a new level of cooperation: "We are prepared to work with the EU to uphold the effectiveness and authority of the multilateral system, promote fairness and justice, and maintain the international order. As two major economies in the world, China and Europe must stay committed to free trade, safeguard the stability of global industrial and supply chains, and play a key role in promoting development and prosperity in the post-COVID-19 world. China and Europe need to set an example of advancing global governance by jointly strengthening the UN's coordinating role in international affairs. We need to reject the practices of putting one's own country first at the expense of others. We need to jointly build a community with a shared future for mankind."

The eight points of the "Global Initiative on Data Security" are carefully drafted and use, inter alia, language from documents adopted by the EU or drafted by the Global Commission on the Stability of Cyberspace.

The Chinese initiative calls for an "open, secure and stable supply chain of global ICT products and services." It says that "States should stand against ICT activities that impair or steal important data of other States' critical infrastructure, or use the data to conduct activities that undermine other States' national security and public interests." It invites states to "oppose mass surveillance against other States and unauthorized collection of personal information of other States with ICTs as a tool." It rejects backdoors in IT products and services and calls on states not to "request domestic companies to store data generated and obtained overseas in their own territory". If law enforcement agencies need data to combat crime, this should be done "through judicial assistance or other relevant multilateral and bilateral agreements" that would "not infringe upon the judicial sovereignty and data security of a third State". In general, States "should respect the sovereignty, jurisdiction and governance of data of other States, and shall not obtain data located in other States through companies or individuals without other States' permission." And "ICT companies should not seek illegitimate interests by taking advantage of users' dependence on their products, nor force users to upgrade their systems and devices. Products providers should make a commitment to notifying their cooperation partners and users of serious vulnerabilities in their products in a timely fashion and offering remedies."

Wang's reply to Pompeo's attack looks very soft and diplomatic. It is not "tit-for-tat." It is embedded in a long-term Chinese strategy that looks to 2030 and beyond. However, the problem with the eight-point plan is less the proposed language. The problem is the missing language. The initiative speaks about the duties of states, but there is nothing about the rights of individuals. It opposes "mass surveillance against other states" but says nothing about mass surveillance of citizens. It argues for an "open, secure and stable" Internet, but there is no call for a "free, unfragmented and interoperable" Internet.

What can be concluded from the two speeches? Is it "American Values" vs. "Chinese Values"? What about the rest of the world?

UN Roadmap on Digital Cooperation

Neither Pompeo's "Clean Network" nor Wang's "Data Security" initiative explicitly references the "Roadmap on Digital Cooperation" launched by UN Secretary General Antonio Guterres on June 11, 2020.6 The UN Roadmap is based on the report of a UN High-Level Panel, co-chaired by an American woman, Melinda Gates of the Bill & Melinda Gates Foundation, and a Chinese man, Jack Ma of Alibaba. The Panel titled its final report "The Age of Digital Interdependence."7 Interdependence means that we are all sitting in the same boat. There is one world and one Internet. Political zero-sum games, as we know them from the 19th and 20th centuries, will see no winners in the 21st century. Efforts to escape from the interconnected world will have unintended and incalculable side effects with high costs for all sides.

When Guterres launched the Roadmap, he said: "We are at a critical point for technology governance. Digital connectivity is indispensable, both to overcome the pandemic and for a sustainable and inclusive recovery. But we cannot let technology trends get ahead of our ability to steer them and protect the public good. If we do not come together now around using technology for good, we will lose a significant opportunity to manage its impact, and we could see further fragmentation of the Internet, to the detriment of all."8

The Roadmap itself offers a mid-term strategy for the way forward in eight areas — from digital inclusion and sustainable development to artificial intelligence and the building of an enhanced institutional mechanism such as an IGF+.9

In his speech at the 75th UN anniversary, September 22, 2020, Guterres said: "Today we have a surplus of multilateral challenges and a deficit of multilateral solutions. National sovereignty — a pillar of the United Nations — goes hand-in-hand with enhanced international cooperation based on common values and shared responsibilities in pursuit of progress for all. No one wants a world government — but we must work together to improve world governance. In an interconnected world, we need a networked multilateralism, in which the United Nations family, international financial institutions, regional organizations, trading blocs and others work together more closely and more effectively. We also need an inclusive multilateralism, drawing on civil society, cities, businesses, local authorities and more and more on young people."10

For the last three decades, globalization and digitalization were seen as the best strategy to solve the world's problems. Zero-sum games were over. There was a strong belief that the world was moving from a system built around walls to a system built around networks. Based on a new spirit of mutual understanding and mutual trust after the Berlin Wall came down, win-win strategies dominated global policymaking. The keywords in Guterres's speech — "common values," "shared responsibility," "networked and inclusive multilateralism" — reflect this spirit.

But globalization and digitalization, which have created the complex networks with this new level of interdependence, have also created a new level of vulnerability. If somebody in the network no longer functions as expected, or misuses its special opportunities to bring chaos and confusion to potential adversaries, the whole system moves to the brink of collapse. And part of today's reality is also that a growing number of governments feel that, in new zero-sum games, they can win more than others. The risk is high that we move into a world built around "fortresses".11

Chained Globalisation

Henry Farrell and Abraham Newman recently introduced in "Foreign Affairs"12 a new term: "chained globalization." They argue that governments are "recognizing how many dangers come with interdependence." But the problem is that "the economies of countries such as China and the US are too deeply entwined to be separated — or "decoupled" — without causing chaos. States have little or no ability to become economically self-reliant. Hawks in Beijing and Washington may talk about a new Cold War, but there is today no way to split the world into competing blocs."

Under chained globalization, "states will be bound together by interdependence that will tempt them to strangle their competitors through economic coercion and espionage, even as they try to fight off their rivals' attempts to do the same. In some ways, chained globalization makes the Cold War seem simple. The economies of the Western and Soviet camps shared few points of contact and thus offered few opportunities for economic coercion. The situation today is far messier. The world's powers are enmeshed in financial, trade, and information networks that they do not fully understand, raising the risk of blunders that could set off dangerous conflicts. Accepting and understanding the reality of chained globalization must be the first step toward limiting those risks. Policymakers cannot cling to fantasies of either decoupled isolation or benign integration. Like it or not, the United States is bound to its competitors. Since it cannot break those bonds, it must learn to master them."

In other words: It is impossible to remove the conflicts and contradictions. The big challenge is to manage the controversies by building bridges between very different value systems. How can this be done? Does the "UN-Roadmap on Digital Cooperation" offer a "third way" based on a "shared vision" of "universal values" via "networked and inclusive multilateralism"?

The heads of the 193 UN Member States, including China and the US, used the 75th UN-anniversary to adopt a Declaration on September 16, 2020, which includes a commitment to improving digital cooperation. It says: "Digital technologies have profoundly transformed society. They offer unprecedented opportunities and new challenges. When improperly or maliciously used, they can fuel divisions within and between countries, increase insecurity, undermine human rights, and exacerbate inequality. Shaping a shared vision on digital cooperation and a digital future that shows the full potential for beneficial technology usage, and addressing digital trust and security, must continue to be a priority as our world is now more than ever relying on digital tools for connectivity and social-economic prosperity. Digital technologies have the potential to accelerate the realization of the 2030 Agenda. We must ensure safe and affordable digital access for all. The United Nations can provide a platform for all stakeholders to participate in such deliberations."13

Europe as Guardian of a Shared Vision for a Free and Interoperable Internet?

The language of this new UN Declaration could also have come from Brussels. For years, the EU, which does not host big Internet corporations the way the US and China do, has seen itself as a global guardian of an open, free, secure, safe, unfragmented and interoperable Internet. The EU is developing regulation that balances individual rights and freedoms with state duties and legitimate business interests. In the field of data protection, the European GDPR has had a tremendous global impact.

Already at the IGF 2018 in Paris, the French President Emmanuel Macron argued against a polarization of the global Internet Governance discussion: "I believe we need to move away from the false possibilities we are currently offered, whereby only two models would exist: a complete self-management, without governance, and a compartmented Internet, entirely monitored by strong and authoritarian states." And he added: "To be very politically incorrect, we are seeing two types of Internet emerge: there is a Californian form of Internet, and a Chinese Internet." His conclusion was to take a "new path where governments, along with Internet players, civil societies and all actors are able to regulate properly."14

The German Chancellor Angela Merkel used similar arguments in her speech at the IGF 2019 in Berlin: "We have to clarify what we mean when, on the one hand, we want to retain our digital sovereignty but, on the other, we want to act multilaterally, and not shut ourselves off. Of course, digital sovereignty is very important. But it may be that we all have come to understand something different by that, even though we are using the same term. As I understand it, digital sovereignty does not mean protectionism, or that state authorities say what information can be disseminated — censorship, in other words; rather, it describes the ability both of individuals and of society to shape the digital transformation in a self-determined way." As an example, she referred to her home country: "We in Germany know that technological innovations do not just happen, that companies do not simply evolve automatically, but that they always need parameters and guidelines. That was the case in the industrial revolution, and it will need to be the same in the internet age. In other words, we need sovereignty over what happens. And so, if we are convinced that isolationism is not an expression of sovereignty, but that we have to base our actions on a shared understanding and shared values, then precisely that — a commitment to a shared, free, open and secure global internet — is in fact an expression of sovereignty".

Merkel also positioned herself towards Internet fragmentation: "What would the consequences be if we went down the road of isolationism? To my mind, the consequences of an increasingly fragmented internet can never be good. They can be many and varied, but never good. The global infrastructure could become unstable and vulnerable to attack. There would be more surveillance. The state would increasingly filter and censor information. Perhaps the Internet and mobile phone networks would even be shut down in order to prevent the people from communicating. This means that an attack on internet connectivity, the foundation for a free and open internet, has become a dangerous political instrument. Attacks like this can deprive the people of their fundamental rights to information and communication. This turns the idea underlying the Internet, the idea of its inventors, completely on its head. And so we should all be determined to protect the heart of the internet as a global public good."15

This spirit also guided the "State of the Union Address" by European Commission President Ursula von der Leyen on September 16, 2020, when she called for a Digital European Decade. "We need a common plan for digital Europe with clearly defined goals for 2030, such as for connectivity, skills and digital public services. And we need to follow clear principles: the right to privacy and connectivity, freedom of speech, free flow of data and cybersecurity. But Europe must now lead the way on digital — or it will have to follow the way of others, who are setting these standards for us." Von der Leyen was also clear that Europe supports both multilateralism and multistakeholderism when it comes to the governance of the Internet: "We are firm believers in the strength and value of cooperating in international bodies. But the truth is also that the need to revitalize and reform the multilateral system has never been so urgent. Our global system has grown into a creeping paralysis. Major powers are either pulling out of institutions or taking them hostage for their own interests. Neither road will lead us anywhere. Yes, we want to change. But change by design — not by destruction. We know that multilateral reforms take time, and in the meantime, the world will not stop. Without any doubt, there is a clear need for Europe to take clear positions and quick actions on global affairs."16

Von der Leyen announced a new European investment of 8 billion Euros into supercomputers. "This is why we want to focus our investments on secure connectivity, on the expansion of 5G, 6G and fiber. NextGenerationEU is also a unique opportunity to develop a more coherent European approach to connectivity and digital infrastructure deployment. None of this is an end in itself — it is about Europe's digital sovereignty, on a small and large scale."

The risk of a bifurcated Internet raises growing concerns around the world. It is not only Europe that is challenged in a new way to navigate through the stormy cyber waters; so are Africa, Latin America and Asia. Just recently — in the August 2020 edition of the "Foreign Affairs" magazine — the Prime Minister of Singapore, Lee Hsien Loong, said: "[Countries of the] Asia Pacific see it in their best interests to maintain good relations with both China and the US while supporting other regional powers. The entire region could pay a huge price if it has to pick a side, or if nations are forced to choose self-preservation over multilateral cooperation."17

  1. https://www.state.gov/announcing-the-expansion-of-the-clean-network-to-safeguard-americas-assets/ 
  2. https://www.fmprc.gov.cn/mfa_eng/zxxx_662805/t1812951.shtml 
  3. https://www.state.gov/securing-freedom-in-the-heart-of-europe/ 
  4. https://www.fmprc.gov.cn/mfa_eng/zxxx_662805/t1811319.shtml 
  5. https://winstonchurchill.org/resources/speeches/1946-1963-elder-statesman/the-sinews-of-peace/ 
  6. https://www.un.org/en/content/digital-cooperation-roadmap/ 
  7. https://digitalcooperation.org/report/ 
  8. https://www.un.org/sg/en/content/sg/statement/2020-06-11/secretary-generals-remarks-the-virtual-high-level-event-the-state-of-the-digital-world-and-implementation-of-the-roadmap-for-digital-cooperation-delivered 
  9. http://www.circleid.com/posts/20200613-un-secretary-generals-roadmap-on-digital-cooperation/ 
  10. https://www.un.org/sg/en/content/sg/statement/2020-09-21/secretary-generals-remarks-general-assembly-ceremony-marking-the-75th-anniversary-of-the-united-nations-bilingual-delivered-scroll-down-for-all-english-and-all-french 
  11. http://www.circleid.com/posts/20200107_internet_governance_outlook_2020... 
  12. https://www.foreignaffairs.com/articles/united-states/2019-12-10/chained-globalization 
  13. https://undocs.org/A/75/L.1 
  14. https://www.intgovforum.org/multilingual/content/igf-2018-speech-by-french-president-emmanuel-macron 
  15. https://www.bundesregierung.de/breg-en/news/speech-by-federal-chancellor-dr-angela-merkel-opening-the-14th-annual-meeting-of-the-internet-governance-forum-in-berlin-on-26-november-2019-1701494 
  16. https://ec.europa.eu/commission/presscorner/detail/en/SPEECH_20_165 
  17. https://www.foreignaffairs.com/articles/asia/2020-06-04/lee-hsien-loong-endangered-asian-century 

Written by Wolfgang Kleinwächter, Professor Emeritus at the University of Aarhus

More under: Internet Governance, Policy & Regulation

A Failed Whois Policy

Tue, 2020-09-22 23:08

ICANN's two-year effort to purportedly preserve the Whois public directory to the greatest extent possible while complying with GDPR has failed. Under the latest proposal, the Whois database, once a contractually-required directory of domain name registrants, will be gutted to the point of virtual worthlessness, as registrars, registries, academics, and hand-wringing others ignored the public interest and imposed ever-higher barriers to legitimate, GDPR-compliant access to registration data. The world now is nearly completely without a tool necessary to protect against online abuses and safeguard important rights.

At its core, the policy recommendations resulting from this process afford ICANN Org, registrars and registries a place to hide from doing the right thing, as allowed under the GDPR. ICANN Org — the ultimate overseer of the domain name system (DNS) — is now faced with a stark choice: either step up to properly enforce its own contracts to the greatest extent possible while complying with GDPR (as it should as the accrediting body charged with oversight of the DNS in the public interest), or acknowledge this is a matter that should be resolved outside of ICANN — and be open to national legislation that can do that.

Background

More than two years ago, ICANN faced a real dilemma. Whois was impacted by GDPR and would need to be revamped.

Registries and registrars, maintainers of the Whois system, needed a way to comply with the new law without violating their contracts with ICANN, which required an operating Whois. At the eleventh hour, ICANN issued a temporary contractual specification for registries and registrars (the "temp spec") that allowed them to close down Whois, except in circumstances when legitimate interests needed access to registration records. ICANN then chartered a group of domain name industry participants to supplant the temp spec with a permanent policy that set out new rules for the Whois system that provided legitimate access without violating GDPR. Known as an expedited policy development process (EPDP), the team's work was meant to be thorough but efficient. After two-and-a-half years of deliberations, a final report details its recommendations.

The problem is, of course, the recommendations are quite empty — inadequate on the most basic level. So insufficient are the team's recommendations, in fact, that they are unlikely to garner meaningful support as they're considered by ICANN's governing policy council (the Generic Names Supporting Organization Council) later this week. They simply don't meet the needs of the community — or the public interest Whois is meant to fulfill — and as a result represent a failed policy.

Warnings from Key Stakeholders Ignored

Broader community reactions also have been swift and damning. Governments and security experts, both tasked with advising the ICANN Board of Directors, have condemned the output, while others, including consumer advocates, have joined the chorus. The common refrain is that the recommendations are woefully inadequate:

  • World governments, represented by the Governmental Advisory Committee, warn that "in their current form [certain recommendations] do not strike the appropriate balance between protecting the rights of those providing data to registries and registrars, and protecting the public from harms associated with bad actors seeking to exploit the domain name system."
  • Security experts, represented by the Security and Stability Advisory Committee, denounced the proposals and noted that the process "has not provided outcomes that are reasonably suitable for security and stability."
  • End-users, globally represented by the At Large Advisory Committee, said of the proposed Whois access and disclosure mechanism that "the probability of its meeting the goals needed by the communities whose efforts [they] support will be low."
  • Global business and intellectual property interests, represented by the Business and Intellectual Property Constituencies, said that the Final Report "fails to deliver a System for Standardized Access that meets the needs of its users."

It should be no surprise to anyone that a slapdash policy that is worse than the failed current policy is met with disapproval.

Why it Failed

From a high level, there are three main reasons why the policy work failed:

First, it doesn't meet its objective, which was to ensure GDPR compliance while maintaining Whois to the greatest extent possible. Unfortunately, the proposed policy doesn't avail itself of the GDPR's most basic tenets on limitations to scope that would allow reasonable access to large swaths of the Whois database (e.g., legal persons, persons outside the European Union). The EPDP team has, instead, chosen to overapply GDPR, and that fails the Whois availability objective entirely.

Second, the recommendations fall short of public interest needs — needs that were recognized by both the European Commission and ICANN itself. The EPDP team focused more on limiting potential registry and registrar liability instead of addressing the public interest needs of the broader community. The recommendations therefore don't even afford the most basic tools necessary to law enforcement and cybersecurity experts to address their interests, which have been repeatedly relayed to the EPDP team, including by the GAC. Again, this leaves the recommendations unfit for the most basic and widely acknowledged of purposes, particularly in an environment of growing DNS abuse that even ICANN CIIO Ashwin Rangan explicitly stated in an August 5 webinar has been "increasing dramatically." 

Third, operationally, the EPDP team has produced little more than a ticketing system for people to submit data requests. It potentially could ease the intake burden, sure, but provides no substantive benefit regarding the heart of the issue to be resolved — disclosure responses to legally and legitimately based requests for Whois data. Specifically, the proposal does nothing to resolve the underlying decision-making issue: that is, the same 2000+ registrars and registries who are largely denying and/or ignoring legal and legitimate requests today will be the same parties receiving the requests through the fancy ticketing system the policy proposes — and probably will continue to largely deny and ignore. It presumably doesn't take two-and-a-half years of policymaking work to produce an intake system ICANN could have independently designed and built itself.

A healthy multistakeholder model, had it been working as designed, would have delivered a balanced solution to the full community. However, this policy output is arguably worse for users than the temp spec itself, which at least was less restrictive and could have offered "reasonable access" to Whois data, had it been properly enforced. Instead, the proposed policy has left the Whois system fractured and not fit for purpose.

What's Next

Those on the side of combating DNS abuse, hunting down criminals, protecting consumers, guarding intellectual property rights, and otherwise making the Internet a safer place must have reasonable access to registration data. Because policy work failed, however, they are now forced to look outside the ICANN process for a workable solution to the data problem and have already begun doing so.

ICANN leadership has responded by going on the attack and defending its multistakeholder model rather than recognize its role in the failure and its fiduciary responsibility to step in and fix the problem. CEO Goran Marby has attempted to spin the failure by stating that "the PDP reached as far as it could," and "we can't do more." In the same breath he indicated that if law enforcement wants access to Whois data — an allowance nearly all in the community support — European data protection authorities will have to update interpretations of the law, or outright change it.

Inexplicably, ICANN Org has now focused its firepower on the GAC. In an antagonistic letter to GAC chair Manal Ismail, ICANN retreats to answering outstanding questions with questions and obfuscates matters by demanding that the GAC further justify its plainly stated concerns over EPDP outcomes with "legal bases." This overtly confrontational letter shows that ICANN, which has been chronically behind the rest of the world in its reaction to and handling of GDPR, continues to take a hard line, exempting itself from taking positions or responsibility in its oversight of the DNS in the public interest. In this vein, the letter reads more like one from a trade association defending the interests of its members than what is expected of an organization charged with oversight of the DNS in the public interest.

More than ever, it's clear now that this lacking result must be met with assertive action. ICANN should finally act — by saying no to a substandard ticketing system, finding a way to get legal certainty about GDPR application, or even enabling an enforceable code of conduct as allowed through its existing contracts with registrars. If ICANN compounds policy development failure with its own failure to act, it should expect others in a position to act to take charge, and to do so with the authority ICANN apparently abandoned willingly during this entire misguided process.

Written by Fabricio Vayra, Partner at Perkins Coie LLP

More under: DNS, Domain Names, ICANN, Internet Governance, Policy & Regulation, Whois

AI Musings: Using Image Classification Creatively and the Coming Collision in City Centers

Mon, 2020-09-21 16:44

Within the supercycle of digital innovation we are experiencing globally, AI is revolutionizing business in both predictable and unpredictable ways. With an expectation of more than $20 trillion added to the global economy by 2030, the impact of AI on business is real. While most of the patents and citations still come from the US, China is rapidly gaining citations. The Stanford AI Index has some compelling data in its 2019 index, though it is not clear when the 2020 index will be published. The current index is probably worth a look if you like numbers.

Creative Data Sets

Being immersed in AI now and well into the future, and as the current UC Berkeley School of Information AI Strategies course draws to a conclusion, we wanted to leave our readership with some thoughts on the ways a single classification dataset, used creatively, can yield abstract, unique and ingestible models. Further, as our interaction with devices gets more interactive, our devices will get smarter and more creative. The more we interact explicitly with messaging from Zoom or WhatsApp (how was the quality of your video content?), the smarter the apps will become, which means more upskilling for you in the future.

Let us start with data augmentation, which is different from augmented reality. A personal and realistic example would be a set of pictures classified in computer vision as an Armenian Orthodox church. Just by turning a picture upside down, we would realize an entirely new and abstract data set. A picture of an Armenian Orthodox church steeple, right side up, if turned upside down, could also potentially be classified as a standard female sex symbol. The output generated from this abstract data set might be a bevy of sexy female Armenian priests, but you get my drift, assuming specific model parameters were lifted. Computer vision with classified images is decisive for the creative mind and very different from augmented reality; we can identify other unique data sets just by changing a picture's angle.
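
As a hedged illustration of the flip idea (the function name, array shapes, and random stand-in "church" images below are assumptions made for this sketch, not part of any production pipeline), the mechanics can be expressed in a few lines of Python with NumPy:

    import numpy as np

    def augment_with_flips(images):
        """Given a batch of images shaped (N, H, W, C), return the originals
        plus upside-down copies, doubling the data drawn from one dataset."""
        flipped = images[:, ::-1, :, :]           # reverse the height axis: upside down
        return np.concatenate([images, flipped])  # originals followed by flipped views

    batch = np.random.rand(10, 64, 64, 3)         # stand-in for 10 classified 64x64 RGB photos
    augmented = augment_with_flips(batch)
    print(augmented.shape)                        # (20, 64, 64, 3)

Whether the flipped copies keep the original label or seed an entirely new, abstract class is a modeling choice, which is exactly the creative point being made here.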

Equally important, the arts expand traditional notions of what it means to be human, notions that are often taken for granted in technological innovation. It is frequently through the arts that we see more expansive, richer understandings of humanity, ones that challenge our enshrined ideas of how bodies can or should move, look, communicate, or gain accessibility, and of how art is perceived by different humans.

"In fact, we are seeing more nuanced opportunities for partnerships between AI and the arts to advance the 'humane' by better representing the fullness of humanity," - Michele Elam

The Collision of Humans and AI in City Centers

AI is eating software. We hear about the global footprint in investments in Autonomous Vehicles, Drugs, Cancer and Therapy, Video Content, Facial Recognition, Fraud Detection, and Finance. However, we do not hear too much about where AI plays an increasingly important role in decision making at the intersection between humans and robots.

We know that city centers will have autonomous vehicles, and there will be no human-driven transit vehicles or cars within these city centers. Autonomous vehicles will be driven at a maximum speed limit to ensure safety within this 1-2 mile radius of city centers. What is also apparent is that these systems will not be optimal in conjunction with human drivers. We cannot allow a human driver to interact with an autonomous vehicle if pedestrians and vehicle occupants are to be kept safe. Within these city limits, decisions made by self-driving cars, driven at a maximum speed limit, will be far different from decisions made by humans. To interact successfully with self-driving cars, we have to upskill and future-proof our interactions with these vehicles, and upskill our interaction with all of our devices. Full AI trust cannot be associated with a human because of bias. For safety and efficiency from a transport perspective, decisions made by machines will have one set of biases (albeit built by a human), and humans driving will have another set of biases. If, for example, you have just one human driving within these city limits, you would already have two data sets.

Finally, the impact this advance will have on individuals is that people will start to get accustomed to interacting with machines as if they were humans. There may even be cases where someone is surprised or even shocked to discover that the other person at the end of the phone line, with whom a long-term dialog has been taking place, is a machine and not a human. The more pervasive AIs become, the less we will recognize them as being something out of the ordinary.

Written by Fred Tabsharani, Digital Marketing Consultant

More under: Internet of Things

The Reverse Donut

Fri, 2020-09-18 19:19

A lot of rural areas are going to get fiber over the next five years. This is due to the various large federal grant programs like ReConnect and RDOF. New rural broadband is also coming from the numerous electric cooperatives that have decided to build broadband in the areas where they serve rural electric customers. This is all great news because once a rural area has fiber it ought to be ready for the rest of this century.

These new fiber networks are going to revive and transform many of these areas. People who want to work from home will move to existing rural homes and build new homes. It doesn't take a lot of high-paying jobs to revive a rural economy. Rural communities are also hoping that fiber can slow the drain of people migrating to cities to find work.

However, nothing this transformational is without consequences. I'm already starting to see some of the consequences of what happens when rural areas get fiber, but the towns in a county don't. I've been referring to this phenomenon as the reverse donut, where all of the rural areas around a county seat or mid-sized rural town have fiber but the town doesn't. Today, most of rural America has better broadband in towns than in rural areas, and maps of broadband in most counties look like only the donut holes have broadband.

Every broadband grant program in the country is aimed at rural areas that have little or no broadband, and that's how it should be. But I've worked in dozens of counties where everybody just assumes that broadband in towns is okay, particularly if a cable company serves a town.

In many cases, this is not true. Small rural towns often don't have the same quality of broadband as larger towns. DSL in smaller towns is often of the oldest vintage and delivers speeds under 5 Mbps. Small town cable systems often underperform. Such systems might have been built in the 1970s and have been largely neglected since then. Aging and deteriorated coaxial cable performs even more poorly than old telephone copper since the network acts as a huge radio antenna and attracts spectral interference through any open cable splice point. It's also not unusual in smaller communities to find neighborhoods that don't have cable broadband. The houses might have been built at a time when the local cable company didn't have the money to construct new cables.

When my firm helps communities to do speed tests, it's not unusual to find a significant percentage of cable subscribers in small towns with download speeds far under 100 Mbps, and in some of the worst cases, under 10 Mbps. It's a big mistake to assume that cable company technology translates to good speeds, because when the network is poorly maintained, this is often not true. It does no good for a cable company to jam new technology upgrades on top of bad copper.

There are going to be some predictable consequences for communities with the reverse broadband donut. New housing construction is likely to occur outside town instead of in town. That means that over time the demand for government services will shift. Most counties have geared services like law enforcement, school transportation, trash services, and numerous other government services and programs around serving the county seat. Over time, property values in towns will dip compared to newer homes with fiber in rural areas.

This is not to say that any of these changes are bad — but having better broadband in rural areas instead of towns will definitely change communities over time. I talk to people in rural county seats all of the time, and many of them are incredulous that there is no grant money to help them get better broadband. The good news is that it's often possible to build a profitable fiber network in small towns without any need for grants. But that is not going to happen unless towns take a proactive approach to attract an ISP willing to invest in their community or even decide to build their own fiber network.

It's my belief that a county or community has not finished the job until everybody has great broadband. Counties that will be getting rural fiber are lucky if their towns already have good broadband. But many counties will look up in a few years and see the consequences of having the reverse donut.

Written by Doug Dawson, President at CCG Consulting

More under: Access Providers, Broadband

The Importance of the Emerging Data Trusts

Fri, 2020-09-18 15:45

I recently followed a webinar session organized by the University of Queensland on the factory of the future. Smart or not, the future will still need factories to make the stuff we humans use every day. One of the questions discussed was: "how will existing production models cope with the staggering and ongoing rate of digital disruption and advanced capabilities?"

Very rapidly, the discussion turned to the necessity of good-quality data for reacting quickly to the transformative changes needed, in this case in manufacturing. COVID-19 was mentioned as perhaps one of the most disruptive events to happen in our lifetime. The term "pivoting" was used here.

What this means is that companies will have to be able to react rapidly to changes and pivot their organizations toward new opportunities and the challenges they face. Modern robots and AI software provide technologies that can make this happen. But that depends on access to good-quality data.

I wrote of the enormous changes in telehealth, expressing the hope that the government will use its leadership to build on the enormous success of these services during the pandemic. They will only be able to do this if they gather, analyze and use the right data. But as we see with the COVID-19 app, people are very wary of providing their data, out of a fear that the government might misuse it.

We are increasingly facing a key problem: the transformations that our economies and societies need depend critically on the use of data, yet people have become increasingly skeptical of what the industry calls "big data." We see both governments and the digital companies creating a surveillance state, and people are resisting this.

At the same time, we see the massive increase in cyber warfare where foreign agents and criminals are providing fake or wrong data to influence politics, business, and society to disrupt and create fear, uncertainty, and doubt.

These bodies get what they want when governments go overboard with restrictions on the free flow of information that is needed to support a values-based, democratic society.

Data is caught in the middle of this. It has become the critical conduit to run our economies and societies; it is taking on the same role as oil and electricity. It is rapidly becoming a national utility.

To allow data to take up that role, we need to radically change how data is gathered, analyzed and used. This has led to the development of the concept of data trusts, which takes the concept of a legal trust and applies it to data.

As in a normal trust, the data trust holds something, and the trustee makes decisions about its use. It is a legal structure that provides independent stewardship of data for the benefit of people, both in a social and economic sense. Data trusts are specifically very useful when sensitive data is involved.

It is important to see the incredibly fast-moving digital revolution unfolding before our eyes, not as a technology event but as an event in our human evolution. So, we do need both STEM and humanities involved in this process.

Ethics is becoming more and more important as a guide to the human face of this. For that reason, it is sad that the government has downgraded the role of humanities studies in our university system. Data trusts need to be seen in that context.

For example, in the case of COVID-19, we could develop "data trusts for good." This could be the most effective way to overcome the lack of sufficient data to tackle the COVID-19 crisis without creating a surveillance state in the process.

The Atlantic Council's GeoTech Center is one of the leading organizations working on policies and strategies aimed at balancing data privacy against utility within the bounds of the social contracts of liberal democracies. They acknowledge a need to establish a public trust that assists nations recovering from the pandemic.

They see COVID-19 as a key opportunity to launch such a concept: a data trust that would assist nations recovering from the pandemic. If this framework could legislate accountability, be designed for privacy, and operate with transparency, it could provide the solutions needed to arrive at a "new normal."

The concept of data trusts would also be ideal for smart cities. Cities are facing the same issues with their data collection, and an independent data trust, overseen by parties that include citizen representation, could go a long way toward overcoming people's reluctance to provide personal data for the common good.

Once we have some of these data trusts up and running, we can learn from them, and if successful, we can use them to build up trust again with those government and commercial organizations who will put their data in such a trust.

As we increasingly face more complex problems, there is no question that we need data, AI, machine learning, and algorithms as tools to manage our cities, companies, countries and, indeed, our world. Rather than procrastinating and resisting the economic and social transformations that are needed, governments should take a leadership role and start working on solutions.

Written by Paul Budde, Managing Director of Paul Budde Communication

More under: Coronavirus, Data Center, Internet Governance, Policy & Regulation

Bill Gates Has Not Forgotten Teledesic

Thu, 2020-09-17 20:07

Might we see another broadband LEO constellation?

Teledesic was the first company to plan to offer broadband connectivity using a constellation of low-earth-orbit (LEO) satellites. Craig McCaw, who had sold McCaw Cellular to AT&T, founded Teledesic in 1990 and it got a big visibility and credibility boost when Bill Gates made a small ($5 million) investment in the company.

McCaw and Gates were able to attract capital — $200 million from a Saudi Prince, $750 million from Motorola, and $100 million from Boeing, which signed on as the prime contractor. When Boeing and Teledesic finished the final design, the constellation had been reduced from the originally planned 840 to 288 satellites. (Later, Motorola replaced Boeing as prime contractor.) The FCC approved Teledesic's Ka-band spectrum application in March 1997, and 37 countries submitted supporting proposals for the December 1997 World Radiocommunication Conference. Teledesic hoped to provide "fiber-like" connectivity to an "Internet in the sky," but was unable to deliver and gave up in 2002.

I don't know what motivated Gates' investment in Teledesic, but today the Gates Foundation is devoted to fighting poverty and providing health care and education in developing nations. Nearly 20 years after the demise of Teledesic, satellite, launch, and communication technology are vastly improved, the entire world is aware of the Internet, we have applications that can utilize "fiber-like" speed and latency, and Gates is clearly aware of the value (and downside) of connecting the unconnected.

Bill Gates might be thinking that it is time for another try.

Last September, Microsoft announced that customers of Viasat, Intelsat, and SES would be able to access Azure cloud services. Their focus is on government, enterprise, maritime, and airline applications, and the announcement states that "each of the partners brings different strengths, for example, choices between Geostationary (GEO), Medium Earth Orbit (MEO) and, in the future, Low Earth Orbit (LEO) satellites," so it seems they are talking with possible LEO partners. (Maybe not with Amazon, given its recent challenge to Microsoft's JEDI defense contract.)

Earlier this month, the FCC authorized Microsoft to establish a proof-of-concept connection between two ground stations in Washington and DEIMOS-2, a Spanish imaging satellite. If successful, the test will demonstrate satellite connectivity to Microsoft's Azure cloud services as well as the rest of the Internet. They plan to run the demonstrations before, during, and after the Ignite conference, which starts on Sept. 22, and if the demonstration results in significant market interest, they will apply for regular ground-station authority, which would put them in direct competition with Amazon's ground-station service. (Microsoft may have a fear of missing out on space).

Terminals with electronically steerable antennas are a critical LEO broadband component. High-end fixed and mobile users will be able to justify relatively expensive terminals, but success in the consumer market will require user-installed, reliable, low-cost terminals. It turns out that Bill Gates was the lead investor in electronically-steerable antenna manufacturer Kymeta at the time of its launch in 2012, and he is now leading an $85 million investment round in support of a high-end mobile service using Kymeta's LEO-ready U8 terminal. The expensive U8 is sold for high-end fixed and mobile applications today, but Kymeta will surely be able to produce a low-cost fixed-service terminal in the future.

If the LEO broadband business case turns out to be viable, these are early days and there is room for competitors. The Gates Foundation endowment is nearly $50 billion, Bill Gates' net worth is $115 billion, and Microsoft is on a roll. Might we see another broadband LEO constellation?

Written by Larry Press, Professor of Information Systems at California State University

Follow CircleID on Twitter

More under: Access Providers, Broadband, Mobile Internet, Telecom, Wireless

Categories: News and Updates

Maximizing Qname Minimization: A New Chapter in DNS Protocol Evolution

Wed, 2020-09-16 23:39

Data privacy and security experts tell us that applying the "need to know" principle enhances privacy and security, because it reduces the amount of information potentially disclosed to a service provider — or to other parties — to the minimum the service provider requires to perform a service. This principle is at the heart of qname minimization, a technique described in RFC 7816 that has now achieved significant adoption in the DNS.

Qname minimization alters the process of DNS resolution by limiting the content of DNS queries to the minimum that a name server needs to do its "first job" of referring requesters to the next name server during an iterative resolution process.

For instance, if the resolver is looking for the Internet Protocol (IP) address of the web server "www.example.TLD" — a three-label domain name — the resolver would send the root server a request for a referral to the authoritative name server for ".TLD" — just one label. The root server would return the referral: "ask TLD".

The resolver would then send a request for information about the name server for "example.TLD" — just two labels — to the TLD name server. The TLD name server would again return the appropriate referral: "ask SLD".

Finally, the resolver would send the full domain name — all three labels, in this case — to the SLD name server. The SLD name server, as in 'ordinary' DNS resolution, would return the DNS record of interest.
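To make the referral walk concrete, here is a minimal sketch of those three minimized queries, assuming the third-party dnspython library; the server IP addresses are placeholders, and the referral-parsing logic a real resolver needs is omitted:

```python
# A minimal sketch of the three minimized queries described above, using the
# third-party dnspython library. Server addresses are placeholders, and the
# referral-parsing logic a real resolver needs is omitted.
import dns.message
import dns.query
import dns.rdatatype

ROOT_SERVER = "198.41.0.4"   # a.root-servers.net, used here for illustration

def ask_for_referral(server_ip, zone):
    """Send only the labels this server needs: an NS query for the zone cut."""
    query = dns.message.make_query(zone, dns.rdatatype.NS)
    return dns.query.udp(query, server_ip, timeout=3)

# Step 1: the root server sees only "tld." (one label), never the full name.
root_response = ask_for_referral(ROOT_SERVER, "tld.")

# Step 2: the TLD server sees only "example.tld." (two labels). Its address
# would be parsed from root_response; a placeholder is used here.
tld_server = "192.0.2.1"     # hypothetical
tld_response = ask_for_referral(tld_server, "example.tld.")

# Step 3: only the final (SLD) server receives all three labels, in a query
# for the record actually wanted.
sld_server = "192.0.2.2"     # hypothetical
final_query = dns.message.make_query("www.example.tld.", dns.rdatatype.A)
answer = dns.query.udp(final_query, sld_server, timeout=3)
print(answer)
```

The key point is visible in the query names: the root sees one label, the TLD server two, and only the final server sees the full name.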

This minimized approach is a simple but innovative step in the evolution of DNS protocol implementation. It's a fundamental way to reduce the sensitivity of DNS data exchanged between resolvers and root and TLD servers (as well as any other name servers prior to the final one in the chain).

Resolver-to-authoritative DNS traffic represents the aggregate DNS lookup requirements of all clients of the resolver and is therefore less sensitive in its particulars than client-to-resolver DNS traffic. This inherent property of DNS recursive resolution, when coupled with qname minimization, enhances privacy and security as responses to minimized queries meld into the resolver's cache. Ultimately it reduces the number of DNS queries sent "across the wire" while also removing visibility of the full domain name at various levels of the DNS hierarchy.

The risk/benefit tradeoff for qname minimization is therefore very attractive: a simple change on the resolver side yields a significant reduction in the DNS data disclosed to authoritative name servers, which in turn yields enhanced privacy and security benefits at the root and TLD levels that some other techniques, including DNS encryption, do not.

Qname-minimized DNS traffic from resolvers is inherently less sensitive, particularly at the root and TLD levels. Proposals to add DNS encryption at these levels of the ecosystem, meanwhile, would require a complex change and add operational risk to both resolvers and authoritative name servers. While encryption would yield a relatively modest benefit against outside observers, without qname minimization the full domain name would still be sent to each name server in the chain — more than the minimum required, and more than the "need to know."

(Note that while there are clear privacy and security benefits from qname minimization, the DNS community will also have to consider the operational tradeoffs that arise as a result of wide-scale adoption of the technique, as the reduction in query content may impair root and TLD servers' ability to do their traditional "second job" of providing visibility into name collisions and other security threats to the DNS.)

The Internet Engineering Task Force began specifying qname minimization in 2014 and published RFC 7816 as an Experimental RFC in 2016. To further support these efforts, in 2015, Verisign announced a royalty-free license to its qname minimization intellectual property rights. The progress since that time has been encouraging, particularly with implementation support and operational deployment.

Open source resolvers such as BIND, Knot and Unbound have all implemented and tested qname minimization functionality, and now enable it by default.

A 2019 paper at the Passive and Active Measurement (PAM) conference reported on an initial study of qname minimization deployment. The researchers' measurements via RIPE Atlas showed the adoption of qname minimization by resolvers quickly growing from 0.7% in May 2017 to 16.8% in April 2018. Their investigation of DNS traffic at the K Root and .NL name servers also confirmed that qname minimization has had an observable impact on traffic seen at authoritative name servers. For instance, the fraction of queries to the .NL name servers for names with no more than two labels increased from 33% to 44% over the same time span.

More recent measurements, reported at an IETF meeting in April 2020 via the RIPE Atlas platform, show that 47% of probes were utilizing qname-minimizing resolvers. As of August 2020, the fraction had increased to 55%, according to the latest statistics in the same data collection hosted by NLnet Labs.

How has qname minimization impacted DNS traffic at Verisign's authoritative name servers? Measurements of the distribution of label counts observed at Verisign's A and J Root servers, as well as at the .COM and .NET name servers, show the overall trend.

These observations demonstrate a shift in the statistical distribution of the number of labels that is consistent with the increased deployment of qname minimization for queries sent to the root and TLD levels of the DNS hierarchy.

In January 2018, 32% of queries received at the A and J Root servers contained only one label, while 30% of queries received at the .COM and .NET name servers consisted of only two labels.

As of August 2020, those measures have increased to an impressive 53% and 49%, respectively — in just a few short years, roughly half of all queries received at these servers have come to reflect this easy and effective security and privacy enhancement!

We look forward to several next steps including further adoption, additional research results on the impact of qname minimization, and the evolution of RFC 7816 from an experimental track document to a standard (rfc7816bis) that reflects this new chapter in DNS protocol deployment.


Written by Matt Thomas, Distinguished Engineer at Verisign

Follow CircleID on Twitter

More under: DNS, DNS Security, Domain Names, Internet Protocol

Categories: News and Updates

The Suitable Defendant Rule: In Rem Jurisdiction under the ACPA

Mon, 2020-09-14 19:47

The Anticybersquatting Consumer Protection Act (ACPA) creates two distinct avenues by which mark owners may seek a remedy for cybersquatting. A person who is a suitable defendant under 15 U.S.C. §1125(d)(1)(A) is one over whom the court has in personam jurisdiction. However, if the mark owner is "not able to obtain in personam jurisdiction over a person who would have been a defendant" in the ACPA action, then "[t]he [mark] owner may file an in rem civil action against a domain name in the judicial district in which the domain name registrar [ ] [or] domain name registry . . . is located." §1125(d)(2)(A). An in rem civil action is against the domain name itself. The issue is whether, and under what circumstances, an "unsuitable defendant" can oust a pending in rem civil action in favor of in personam litigation. What kind of gamesmanship is going on?

In a very recent decision, Prudential Ins. Co. of America v. Pru.com, 20-cv-00450 (E.D. Va., Alexandria Division, July 22, 2020), the Court was presented with an issue of first impression: the domain name holder, who was not otherwise a "suitable defendant" under Section 1 of the ACPA, appeared after the commencement of the in rem civil action and moved to dismiss the complaint on the theory that it had consented in the registrar agreement to in personam jurisdiction in Arizona, the seat of the registrar. The unsuitable defendant's theory rests on what under the Uniform Domain Name Dispute Resolution Policy is called "mutual jurisdiction." This, of course, presupposes a UDRP award transferring the domain name to the mark owner, which never happened because although a UDRP complaint was commenced, it was withdrawn in favor of the in rem civil action. The withdrawal was probably strategic, which I'll comment on in closing this piece.

The issue before the Court is whether an otherwise unsuitable defendant can affect the course of the litigation. The Court rejected the foreign registrant's argument in the following strong language:

To [accept foreign registrant's argument] would incentivize foreign registrant defendants to engage in gamesmanship and forum shopping by consenting to jurisdiction in a forum of their choice at a time of their choosing after litigation has commenced against them. Such gamesmanship would only serve to incur additional litigation costs on trademark owners. The ACPA does not permit such an absurd result. (Emphasis added)

By what route of reasoning did the court reach the conclusion that "[t]he ACPA does not permit such an absurd result"?

The Court first of all refers to an earlier case decided by the 4th Circuit in 2002, Porsche Cars N. Am., Inc. v. Porsche.net, 302 F.3d 248 (4th Cir. 2002). In that case, the foreign registrant of several domain names decided to submit to personal jurisdiction in the district court for the Southern District of New York "three days prior to the scheduled trial date for the in rem action against the domain names in the District Court for the Eastern District of Virginia." The court rejected the argument by the foreign registrant in Porsche, but the Prudential Court concluded that the Fourth Circuit "did not decide when personal jurisdiction over a registrant must exist in order to preclude in rem jurisdiction over the domain name under Option 1" (emphasis on "when" is in the decision).

So the issue is: when is a party precluded from affecting the course of the litigation? The answer rests on the court's formulation of a "suitable defendant" rule. Since the foreign registrant was not a suitable defendant when the mark owner commenced the in rem civil action, it could not make itself suitable by consenting to in personam jurisdiction. Where "a trademark owner has exercised due diligence and determined that any would-be defendant is a foreign person that is not subject to the personal jurisdiction in the United States . . . 'the jurisdiction of the court depends upon the state of things at the time of the action brought, and that after vesting, it cannot be ousted by subsequent events,'" citing a case from 1824 (yes, 1824!): Mollan v. Torrance, 9 Wheat. 537, 539.

We will hear more about this case because although unsuitable as an in personam defendant, the domain name holder can still defend itself on the theory that it registered <pru.com> lawfully. The Court filed an Order on September 11, 2020 that denied plaintiff's motion for default judgment (it set aside the default of record) and granted defendant's motion for leave to file an answer out of time.

While it may seem curious why the domain name registrant wanted to appear personally and be subject to in personam jurisdiction, the fact is that this case presents an important issue for domain name investors. What possible right could Prudential have to <pru.com>? Well, it has a bunch of "pru+" trademarks and is alleging rights to "pru" alone (a family-of-marks argument), which, of course, is the reason to follow closely the next act in this case. The basic fact is that Prudential surreptitiously sought to acquire the domain name after learning it was being offered for sale; when it learned the asking price was in the six figures, it filed a UDRP complaint, which it withdrew after the domain registrant responded with a defense of lawful registration.

One can easily draw the inference that Prudential withdrew the UDRP complaint because the Respondent offered a rebuttal sufficient to jeopardize prevailing in that forum. Let's see how that "rebuttal," when it comes, tallies up in federal court! The E.D. Virginia, Alexandria Division, though, has not been kind to investors whose domain names are identical or confusingly similar to trademarks, most recently in Code-To-Learn Foundation D/B/A Scratch Foundation v. scratch.org, 1:19-cv-0006 (2019), even though the domain name was registered before the mark existed.

Written by Gerald M. Levine, Intellectual Property, Arbitrator/Mediator at Levine Samuel LLP

Follow CircleID on Twitter

More under: Domain Management, Domain Names, Brand Protection, UDRP

Categories: News and Updates

A New Fiber Optic Speed Record

Thu, 2020-09-10 19:34

Dr Lidia Galdino, lead researcher on the study. (Photo: UCL)

Researchers at University College London (UCL) have set a new record for fiber optic transmission speed. They've been able to communicate through a fiber optic cable at over 178 terabits per second, or 178,000 gigabits per second. The research was done in collaboration with fiber optic firms Xtera and KDDI Research. The press release announcing the achievement claims this is 20% faster than the previous record.

The achieved speed has almost reached the Shannon limit, which defines the maximum amount of error-free data that can be sent over a communications channel. Perhaps the most impressive thing about the announcement was that UCL scientists achieved this speed over existing fiber optic cables and didn't use pristine fiber installed in a laboratory.
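For context, the Shannon limit comes from the Shannon-Hartley theorem, C = B × log2(1 + SNR). A quick sketch with purely illustrative bandwidth and signal-to-noise figures (not the parameters of the UCL experiment) shows how those two quantities bound the error-free data rate:

```python
# Shannon-Hartley capacity: C = B * log2(1 + SNR).
# The bandwidth and SNR values below are purely illustrative and are not
# the parameters of the UCL experiment.
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Maximum error-free data rate of a channel, in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

bandwidth_hz = 16.8e12           # hypothetical optical bandwidth of ~16.8 THz
snr_db = 20.0                    # hypothetical signal-to-noise ratio in dB
snr_linear = 10 ** (snr_db / 10)

capacity = shannon_capacity_bps(bandwidth_hz, snr_linear)
print(f"Shannon limit: {capacity / 1e12:.1f} Tbps")   # roughly 112 Tbps for these numbers
```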

The fast signal throughput was achieved by combining several techniques. First, the lasers use Raman amplification, which involves injecting photons of lower energy into a high-frequency photon stream. This produces predictable photon scattering, which can be tailored to the characteristics needed for optimally traveling through glass fiber.

The researchers also used erbium-doped fiber amplifiers. For those who have forgotten the periodic table, erbium is a metal found in nature with an atomic number of 68. Erbium has a key characteristic needed for fiber optic amplifiers in that the metal efficiently amplifies light in the wavelengths used by fiber-optic lasers.

Finally, the amplifiers used for the fast speeds used semiconductor optical amplifiers (SOA). These are diodes that have been treated with anti-reflection coatings so that the laser light signal can pass through with the least amount of scattering. The net result of all of these techniques is that the scientists were able to reduce the amount of light that is scattered during the transmission through a glass fiber cable, thus maximizing data throughput.

UCL also used a wider range of wavelengths than is normally used in fiber optics. Most fiber optic transmission technologies leave empty guard bands around each wavelength band being used (much like we do with radio transmissions). The UCL scientists used all of the spectrum, without separation bands, and used several techniques to minimize interference between bands of light.

This short description of the technology is not meant to intimidate a non-technical reader, but rather to show the level of complexity in today's fiber-optic technology. It's a technology that we all take for granted, but one that is far more complex than most people realize. Fiber optic technology might be the most lab-driven technology in daily use: it came from research labs, and scientists have been steadily improving it for decades.

We're not going to see multi-terabit lasers in regular use in our networks anytime soon, and that's not the purpose of this kind of research. UCL says that the most immediate benefit of their research is that they can use some of these same techniques to improve the efficiency of existing fiber repeaters.

Depending upon the kind of glass being used and the spectrum utilized, current long-haul fiber technology requires having the signals amplified every 25 to 60 miles. That means a lot of amplifiers are needed for long-haul fiber routes between cities. Without amplification, the laser light signals get scattered to the point where they can't be interpreted at the receiving end of the light transmission. As implied by their name, amplifiers boost the power of light signals, but their more important function is to reorder the light signals into the right format to keep the signal coherent.
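As a rough back-of-the-envelope illustration (the route length below is hypothetical), the amplifier count for a route scales directly with the span length:

```python
# Back-of-the-envelope amplifier count for a long-haul route.
# The route length is hypothetical; the 25-60 mile spacing is the range
# mentioned above.
import math

route_miles = 2500                      # hypothetical city-to-city route

for span_miles in (25, 60):
    # One amplifier is needed at the end of each span except the last,
    # where the receiving terminal recovers the signal.
    amplifier_sites = math.ceil(route_miles / span_miles) - 1
    print(f"{span_miles}-mile spans: about {amplifier_sites} amplifier sites")
```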

Each amplification site adds to the latency in long-haul fiber routes since fibers must be spliced into amplifiers and passed through the amplifier electronics. The amplification process also introduces errors into the data stream, meaning some data has to be sent a second time. Each amplifier site must also be powered and housed in a cooled hut or building. Reducing the number of amplifier sites would reduce the cost and the power requirement and increase the efficiency of long-haul fiber.

Written by Doug Dawson, President at CCG Consulting

Follow CircleID on Twitter

More under: Access Providers, Broadband, Telecom

Categories: News and Updates

A Responsible Domain Industry Needs a Responsible Registrant Appeals Process

Thu, 2020-09-10 18:37

As the steward of .ORG, Public Interest Registry is committed to serving as an "exemplary registry" for the DNS. As part of that mission, PIR published our Anti-Abuse Principles last year that serve as our north star to address questions of abuse.

As PIR has stated on many occasions, generally speaking, the DNS is not the appropriate place to address questions of website content abuse because of the blunt tool we as a registry have and the collateral damage that can be caused by suspending a domain name for a piece of content. That is why our Anti-Abuse Policy primarily focuses on DNS Abuse rather than website content abuse. That said, there are limited instances of website content that can be so egregious that they require action at the DNS. In 2019, we suspended thousands of domains related to DNS Abuse, but suspended only eleven over website content (8 for distribution of Child Sexual Abuse Materials and 3 for being dedicated to distribution of opioids online). One of the cornerstones of our Anti-Abuse Principles is the commitment to due process.

We do not want that commitment to just be lip service, so we are offering a two-step process to address questions of our anti-abuse efforts. Any registrant may contact PIR if he or she thinks we made a mistake in taking action under our Anti-Abuse Policy. Through this informal process, which has no fee, we have reversed several suspensions when presented with new information or information that a domain name was compromised at the time it engaged in DNS Abuse. This free review is built in as a prerequisite to the Appeal process so registrants may have the suspension reversed prior to incurring any cost associated with the formal registrant Appeals process.

Beyond that informal step, PIR is instituting a new solution that creates new rights for .ORG registrants: The right to appeal a suspension under our Anti-Abuse Policy to a neutral third party. This appeal mechanism will be administered by Forum, previously the National Arbitration Forum. While Forum charges $1,200 per case, PIR will subsidize $700 and then reimburse the other $500 if the appeal is successful.

In formulating this neutral-party appeals process, PIR sought out a number of perspectives and voices, from our Advisory Council to Article19 and the Danish Institute for Human Rights to the members of several ICANN constituencies (including the ALAC, BC, GAC, NCUC and the Registrar Stakeholder Group) and the Internet and Jurisdiction (I&J) Policy Network. The creation of this Appeals process is consistent with both the guidance I&J has offered in the past, encouraging contracted parties to offer public appeal mechanisms, and the recommendations of Article19 and the Danish Institute for Human Rights which urged a redress mechanism for registrants.

We know that with every new mechanism like this, there will be a spectrum of views. We welcome it and are proud of what we are creating here. We listened to the feedback we received and made improvements to the process as a result, including making it more digestible by publishing the two Appeal process companion documents: the FAQs and visual timeline of the process.

Make no mistake, launching this mechanism does not mean that PIR is expanding its Anti-Abuse Policy and that we are going to start acting on more "content" referrals. This simply creates a new right for registrants to have a neutral party hear their appeal for a suspension for abuse.

PIR is focused on anti-abuse because we know the impact of a healthy domain industry: the introduction of more and stronger mission-driven organizations and people devoting their efforts to making their communities, and therefore the world, a better place. PIR believes that, with our anti-abuse principles combined with a timely and transparent process for registrants to appeal suspensions, we have found the delicate balance that can give the community confidence in a safer, stronger DNS.

Written by Brian Cimbolic, Vice President, General Counsel at PIR

Follow CircleID on Twitter

More under: Cybersecurity, Domain Management, DNS, Domain Names, ICANN, Internet Governance, Policy & Regulation, Registry Services

Categories: News and Updates

Defining KPIs to Measure the Success of Your AI Strategy

Tue, 2020-09-08 21:49

KPIs are industry-specific and should be aligned carefully with your AI strategy. My course at UC Berkeley drills down heavily on how to define success when implementing your AI strategy, and measurement, like anything else, is the top priority.

One technique that can potentially be used in your organization as you embark on your AI strategy is to use the SMART method for KPIs. SMART stands for Specific, Measurable, Attainable, Relevant, and Time-Sensitive. It's essential to understand the difference between a goal and a KPI: a KPI measures success and milestones, not the outcome or goal attained. It's critical to rank the KPIs in terms of importance and focus on a small subset of KPIs that align with your most important outcomes. If your focus is too broad, then chances are your AI strategy portfolio can get out of balance. I'll share what I mean about the portfolio shortly.

Let's take a SaaS company that wants to reduce churn, for example. The goal is to implement AI in a way that reduces the churn rate by identifying a plan. Initially, let's be "specific" and say we want to reduce churn by 10% over the next quarter. Now that the specific goal has been identified, we can focus on measurement. One KPI, for example, in the initial month of the quarter, is to maximize the retention rate. What can the company do to minimize churn and maximize retention in the first quarter? A brainstorming session ensues, and you focus on three KPIs for the first quarter. Perhaps one KPI will focus on the success of a promotion initially targeted at a subset of subscribers. Based on that success, you might extend the same offer to an increasing number of targeted visitors, offering an annual subscription at a reduced rate of 50% off. If AI is used correctly, it can help you discover the right promotion rate (50% or 60%), the specific price point at which your subscribers will happily renew, based on the velocity of subscribers renewing in a given hour, day, or week, and of course predict the percentage of subscribers that will restart or renew in a given month.
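To make the measurement piece concrete, here is a minimal sketch of how the churn and retention KPIs behind that SMART goal might be tracked month by month against the 10% reduction target; the subscriber counts are invented purely for illustration:

```python
# Hypothetical monthly subscriber counts used to track a SMART goal of
# reducing churn by 10% over the quarter; none of these figures are real.
baseline_churn_rate = 0.080          # churn rate in the quarter before the AI program

months = {
    # month: (subscribers at start of month, subscribers lost during month)
    "month_1": (10_000, 760),
    "month_2": (10_150, 745),
    "month_3": (10_320, 720),
}

for month, (start, lost) in months.items():
    churn = lost / start
    retention = 1 - churn
    # The KPI: churn relative to the baseline, versus the 10% reduction target.
    reduction = (baseline_churn_rate - churn) / baseline_churn_rate
    on_track = reduction >= 0.10
    print(f"{month}: churn {churn:.1%}, retention {retention:.1%}, "
          f"reduction vs. baseline {reduction:+.1%}, target met: {on_track}")
```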

The third aspect of measuring your success for this program is whether it was attainable. This is critical for the maturity of the data set. AI will build upon the success of your current retention rate to discover more ways to keep your clients subscribed. If you reach too far with AI and, for example, introduce third-party data sets, your specific goal might be threatened by various factors. Be careful with third-party datasets and how you blend them with your internal datasets before implementing the AI strategy. Another risk that could derail your AI strategy is neglecting the portfolio model. Your AI strategy is your asset basket. It's essential to keep a balance on which AI strategy is most important in your organization and to rebalance after each successful implementation to build on that success. For example, it's not prudent to implement an AI strategy across all departments of your organization until you have had many successes. So, once again, focus on small wins before taking on new implementations.

Choosing relevant KPIs is critical to the success of your AI strategy. Churn rate and retention rate are applicable here. What was the critical success factor? Was it the offer of 50% off an annual subscription? Was it the timing of the message? Or was it the type of message? Did the prospect engage in chat commerce, or was it through the old-fashioned email promotion? These are all relevant KPIs that serve as measuring sticks for how well the AI succeeded, and that the AI will learn from and build upon for the next set of goals for your company.

Written by Fred Tabsharani, Digital Marketing Consultant

Follow CircleID on Twitter

More under: Data Center, Web

Categories: News and Updates

Only Bad Actors Should Worry About the URS

Tue, 2020-09-08 20:18

Co-authored by David McAuley, Policy Manager at Verisign and Griffin Barnett, Associate at Winterfeldt IP Group.

With DNS abuse a topic of increased concern throughout the community, any controversy over adopting the Uniform Rapid Suspension System (URS) for all generic top-level domains (gTLDs) seems misplaced.

The URS was designed as a narrow supplement to the Uniform Domain-Name Dispute Resolution Policy (UDRP), applicable only in certain tightly defined circumstances of clear-cut and incontrovertible trademark infringement involving the registration and use of a domain name. The ICANN Rights Protection Mechanism (RPM) Review Working Group has closely examined the URS over the past four years, and so far has consistently concluded it to be fit for purpose in quickly disabling the most egregious instances of bad faith registration and use of a domain name that takes unfair advantage of an established trademark.

A comprehensive January 2018 RPM Review Working Group analysis of all URS cases filed to that point demonstrated that it had been an effective tool against blatant cybersquatting and associated DNS abuse (e.g. phishing schemes leveraging a trademark-infringing domain name), with 827 cases brought against 1,861 domains and an average case duration of just 17 days. Nothing in the data, and nothing in the Working Group deliberations, has pointed toward any significant abuse of the URS (for instance, by overly aggressive or overreaching brand owners). In fact, the data suggest just the opposite, namely that brand owners have turned to the URS in only the very limited subset of instances where it is specifically intended to be used — i.e. to quickly suspend a domain name that, by clear and convincing evidence, is being used to infringe the complainant's trademark to perpetrate consumer harm. Indeed, although the URS includes specific penalties for abusive complainants, no such abuses of the URS have been found to date.

The 2018 analysis also showed that the URS is providing meaningful due process for registrants, even in default cases, as well as a demonstrably effective appeals process. Nonetheless, to further improve the efficacy of the URS, the Working Group appears on track to recommend certain process improvements. These include enhanced notice requirements to ensure registrants receive proper notice of URS proceedings instituted against them, clearer language requirements to ensure proceedings are conducted in the registrant's language, and making available additional information about the URS through ICANN and URS providers' websites to help participants in the URS better understand the meaning of the proceedings and how to participate fairly.

In short, the case for adopting the URS in all gTLDs is clear — so clear that only those with something to gain from cybersquatting and other DNS abuses that leverage trademarks should worry about it being applied in those gTLDs where it has not yet been implemented. Otherwise, it seems obvious that those who hope for a cleaner DNS would support the adoption of the URS as an ICANN Consensus Policy applicable to all gTLDs.

Written by David McAuley, Senior International Policy Manager, Naming and Registry Services at Verisign

Follow CircleID on Twitter

More under: Cybersecurity, Domain Management, DNS, Domain Names, ICANN, Brand Protection, Internet Governance, New TLDs, UDRP

Categories: News and Updates

New Digital Services Act Should Not Disrupt Internet's Technical Operations, Warn RIPE NCC, CENTR

Tue, 2020-09-08 19:30

RIPE NCC and CENTR have released a statement today in response to the European Commission's upcoming Digital Services Act, urging policymakers to make a distinction between the Internet's core infrastructure and operations, and to protect the core infrastructure from unnecessary and disproportionate intervention. The new Digital Services Act is considered a significant piece of regulation — an update to the E-Commerce Directive of 2000 — that will provide the legal framework for regulating digital services in the EU and set out the liability regime for "information society service" providers. Read the full release here.

Follow CircleID on Twitter

More under: Access Providers, DNS, Domain Names, Internet Governance, Internet Protocol, Law, Policy & Regulation

Categories: News and Updates

TLD Maintenance Significantly Improved With the New Registry Maintenance Notifications for EPP

Tue, 2020-09-08 16:56

Three years ago, the first Internet-Draft on Registry Maintenance Notifications for the Extensible Provisioning Protocol (EPP) was published; it is now on track to become a Request for Comments (RFC).

The IETF Registration Protocols Extensions (REGEXT) working group is the home of the coordination effort for standards-track EPP extensions. It has released eight RFCs over the last couple of years and is currently working on more than 15 Internet-Drafts.

Let's take a minute to discuss this with the authors of the new document.

For those who don't know you: Who are you, and what do you do?
Jody Kolker: I am a Senior Director in the domains organization at GoDaddy. I work with our domains engineering team to onboard and implement new gTLDs. I act as a liaison between our engineering teams and registry product and technical teams. I am also active in the CPH TechOps group at ICANN and the IETF REGEXT group.

Roger Carney: I am a Director of Policy at GoDaddy. I focus on domain name policy creation undertaken in the ICANN community, with a specific awareness of the engineering challenges presented by this policy work at ICANN. That leads me to be involved in many community groups (GNSO, CPH TechOps, IETF, etc.) to help provide solutions for the domain name ecosystem.

Tobias Sattler: I am a member of the board and CTO of united-domains and one of this document's authors. I was the Vice-Chair/CTO of the ICANN Registrar Stakeholder Group and Founder & Co-Chair of the CPH TechOps group.

What exactly is the Registry Maintenance Notification?

Tobias Sattler: The Registry Maintenance Notifications for the Extensible Provisioning Protocol (EPP) enables domain name registries to schedule their maintenance windows in a standardized format. That allows domain name registrars to prepare their systems efficiently and adjust their operational plans.
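To give a sense of what a standardized notification enables on the registrar side, here is a purely illustrative sketch of parsing a maintenance notice and extracting the affected window; the XML element names and namespace are invented for this example and do not necessarily match the schema in the Internet-Draft:

```python
# Illustrative only: the element names and namespace below are hypothetical
# and are not taken from the actual Internet-Draft/RFC schema.
import xml.etree.ElementTree as ET
from datetime import datetime

SAMPLE_NOTICE = """
<maintenance xmlns="urn:example:maintenance-0.1">
  <id>2e3c8f0e-maint-01</id>
  <system>EPP</system>
  <environment>production</environment>
  <start>2020-10-03T01:00:00Z</start>
  <end>2020-10-03T03:00:00Z</end>
  <reason>planned</reason>
</maintenance>
"""

NS = {"m": "urn:example:maintenance-0.1"}

def parse_notice(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    to_dt = lambda s: datetime.fromisoformat(s.replace("Z", "+00:00"))
    return {
        "id": root.findtext("m:id", namespaces=NS),
        "system": root.findtext("m:system", namespaces=NS),
        "environment": root.findtext("m:environment", namespaces=NS),
        "start": to_dt(root.findtext("m:start", namespaces=NS)),
        "end": to_dt(root.findtext("m:end", namespaces=NS)),
    }

notice = parse_notice(SAMPLE_NOTICE)
# A registrar could now pause affected operations automatically for the
# maintenance window instead of reading an ad hoc email.
print(f"{notice['system']} ({notice['environment']}) unavailable "
      f"from {notice['start']} to {notice['end']}")
```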

How did you get the idea to write this RFC?

Tobias Sattler: The idea started a couple of years ago. At the time, ICANN collaboration focused on policy work only. As a result, we decided at the ICANN Registrar Stakeholder Group to create a think tank that addresses technical and operational needs and founded TechOps.

Right at the beginning, we created a backlog with priorities. Many challenges of the new gTLDs influenced this backlog. Given the DNS namespace expansion, it was hard to keep track of all maintenance information sent to domain name registrars. Back then, this information was transmitted in different formats. Someone had to review this information and decide what to do next. Our goal was to improve and standardize this process, in addition to saving resources.

When TechOps prioritized the backlog, the Registry Maintenance Notifications was assigned the highest priority. We planned to publish an issue paper, discuss this with the group for a couple of weeks, and then see what we could do. It was evident that we needed a standardized mechanism. Since our industry uses EPP, it was apparent to choose an EPP extension.

However, this led to new considerations: whether and under which conditions everyone would implement this EPP extension. We chose a standards-track RFC to ensure acceptance. After submitting the Internet-Draft, we asked the IETF REGEXT working group for adoption.

What is the IETF REGEXT working group?

Jody Kolker: The IETF REGEXT working group establishes standards for the extensions to the Extensible Provisioning Protocol, which is the domain name provisioning protocol for top-level domain name registries. The working group also sets standards for the RDAP protocol.

Tobias Sattler: If you are interested in contributing to this working group, please feel free to join.

What is an RFC? Can you please explain it a bit more?

Jody Kolker: An RFC (Request for Comments) is a publication developed by engineers and individuals who describe either a best practice or a standard protocol for how systems communicate.

What is the process of publishing an RFC?

Jody Kolker: A proposed Internet-Draft is written and presented to the applicable working group within the IETF. The working group may then choose to adopt the document for review. The draft goes through a thorough, iterative review process with community technical experts. Once the draft reaches stability (experts have no more edits), it is approved in a working group last call. The document is then sent to the Internet Engineering Steering Group (IESG) for review. Once the IESG has finished its review (including any agreed-to edits) and approves the draft, it is sent to the RFC editor for one last round of comments and fixes. Once these fixes are applied, it is published as an RFC.

How long will it still take for this document to become an RFC?

Jody Kolker: As this draft is getting ready for the Working Group Last Call, it typically takes six to twelve weeks to complete the remaining work items on its way to becoming an RFC.

What about the implementation? Is there already someone who uses your extension?

Tobias Sattler: Neustar already started implementing the new EPP extension before they became GoDaddy Registry. We are looking forward to seeing more domain name registries implement the extension.

Do you already have other ideas, or are you already working on other RFCs?

Tobias Sattler: I am not working on other Internet-Drafts, nor do I plan to submit a new one anytime soon. We will see what happens next, but I am open to new things.

Jody Kolker: I'm currently engaged in another REGEXT draft to provide better security for transferring domains between registrars.

Roger Carney: As Tobias mentioned earlier, the REGEXT WG has a steady list of work ahead. The three of us are not authors on those drafts, but we will be participating in their review, development, and eventual standardization.

What are your plans for the future?

Tobias Sattler: I am happy that this document will be published as an RFC. Although we still have to promote RFC adoption, I think the benefits speak for themselves. Apart from that, I will continue to support TechOps and the IETF REGEXT working group.

Jody Kolker: I'll continue participating in TechOps and REGEXT working groups, helping drive innovation where I can. I'll also continue to push for the adoption of Registry Maintenance Notification.

Roger Carney: As Tobias mentions, there is still a lot of socializing of this draft/standard so that we can get wide adoption and achieve the most significant benefits. Aside from promoting the Registry Maintenance Notifications, continue pushing forward and help out where needed.

Thank you for your time and insight!

Written by Tobias Sattler, CTO at united-domains

Follow CircleID on Twitter

More under: Domain Names, ICANN, Internet Protocol, Policy & Regulation, Registry Services, New TLDs

Categories: News and Updates

How Will the New .AU Domain Licensing Rules Impact You?

Thu, 2020-09-03 19:46

The .AU Domain Administration (auDA) will implement new .AU domain licensing rules either late this year or early next year. These rules apply to new registrations and around 3 million existing domain names in the com.au, net.au, org.au, and other .AU namespaces.

What's changing?

Previously, an Australian trademark application or registration could constitute the required Australian presence for an .AU domain name, but the domain name did not need to match the trademark.

Under the new rules, a domain name registration based on an Australian trademark must exactly match the trademark. That is, the domain name is identical to and is of the same order as the words that are the subject of an Australian trademark application or registration, excluding domain name system (DNS) identifiers such as com.au, punctuation marks, articles such as "a," "the," "and," and "of," and "&."

Trademark example            Example of an exact match domain name
Tweedledee & Tweedledum      tweedledeetweedledum.com.au
The Frog Prince!             frogprince.org.au
beesknees.com                beesknees.net.au
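As a rough illustration of how such an exact-match check could be automated across a portfolio, here is a minimal sketch based on the rule as summarized above (not auDA's official logic):

```python
# Sketch of the exact-match test described above: strip DNS identifiers,
# punctuation, "&", and the words "a", "the", "and", "of" from the trademark,
# then compare the result with the domain label. Not auDA's official logic.
import re

DROPPED_WORDS = {"a", "the", "and", "of"}

def strip_dns_identifiers(name: str) -> str:
    """Remove trailing DNS identifiers such as .com.au from a name."""
    label = name.lower()
    for suffix in (".com.au", ".net.au", ".org.au", ".au", ".com"):
        if label.endswith(suffix):
            return label[: -len(suffix)]
    return label

def normalize_trademark(trademark: str) -> str:
    """Reduce a trademark to the string an exactly matching domain must use."""
    text = strip_dns_identifiers(trademark).replace("&", " ")
    words = re.findall(r"[a-z0-9]+", text)          # drops punctuation
    return "".join(w for w in words if w not in DROPPED_WORDS)

def is_exact_match(trademark: str, domain: str) -> bool:
    return normalize_trademark(trademark) == strip_dns_identifiers(domain)

print(is_exact_match("Tweedledee & Tweedledum", "tweedledeetweedledum.com.au"))  # True
print(is_exact_match("The Frog Prince!", "frogprince.org.au"))                   # True
print(is_exact_match("beesknees.com", "beesknees.net.au"))                       # True
```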

What's the impact?

Non-compliant domain names must have their errors corrected before they come up for renewal any time after the new rules are implemented.

Failing to comply could mean that auDA or the managing registrar could suspend or cancel the non-compliant domain. Once a domain name is canceled, it may not be transferred or renewed. It will be purged from the registry records and made available for registration by the general public.

Before deciding that it might be okay to purge a few domains, be aware that lapsed or abandoned domain names carry a footprint of digital activity that can be leveraged as an attack vector or cause disruption to a virtual private network (VPN), voice-over IP (VoIP), website, services, servers, network or email, and a host of other dependencies.

As a rule of thumb for the domains you want to maintain, all domain registrants are obligated to keep their domain information complete, true, and accurate throughout the lifetime of their domain names. Similarly, all .AU domain registrants are continually required to have a valid Australian presence and satisfy any eligibility and allocation criteria for the namespace being applied for.

What to look out for, and why it's important

Changes to registrant contacts and entity information over time may not have been applied to domain portfolios, and this licensing rule change presents a great opportunity for domain registrants to ensure their domain information is up to date.

First, run an audit of all .AU domain names, then update non-compliant .AU domain names.

Additionally, this domain information should be reviewed thoroughly:

  1. The business registration number (ABN or ACN for example) where applicable
  2. Registrant contact information
  3. Technical and administrator contact information
  4. WHOIS data
  5. All other data in registry records

Cleaning up your .AU domain portfolio not only makes you compliant with the new rules, but also makes you eligible to participate in the launch of the top-level .AU domain that is widely expected to take place in the first half of 2021.

Having inaccurate domain information could lead to suspension or cancellation of domain names by domain registries. Organizations not only become vulnerable to losing control of their domain names, websites, or applications when their existing domains expire, but also lose the opportunity to lay their claim to the shorter top-level .AU domains.

This article was originally published on Digital Brand Insider.

Written by Connie Hon, Domain Product Manager at CSC

Follow CircleID on Twitter

More under: Domain Management, DNS, Domain Names, Brand Protection

Categories: News and Updates

Is There Such a Thing as Technical Internet Governance?

Thu, 2020-09-03 00:32

In ICANN's "President & CEO Goals for Fiscal Year 2021," Göran Marby makes a curious distinction in the document's second stated goal, according to which he intends to "Implement a common strategy for Internet governance (IG) and technical Internet governance (TIG)." He proceeds to state that "we will begin by identifying the most important issues we need to address, followed by an assessment of where and how we can intervene, the venues we should use, and the resources required to be effective."

While it is common to separate technical and content matters within Internet Governance, the term Technical Internet Governance, and particularly the acronym TIG, are not widespread and find little usage in relevant IG literature. This raises the question of what Marby intends to accomplish by making that distinction and how the broader community should understand it in terms of ICANN's position in the ecosystem.

According to the same document, the background for this objective is that "over the last few years, we have seen an increase in legislative proposals that affect ICANN's ability to form policies and make decisions. We are also seeing proposals through standardization forums that can have absolute effects on how the Internet is technically operated".

It can be surmised that the situations being referred to here are the EU General Data Protection Regulation (GDPR) and its effects on ICANN's data governance; and the deployment of DNS over HTTPS (DoH, RFC 8484) and its impacts on ICANN's capability to remain the sole authority in the supervision of the root name servers' operation.

While the impact of the GDPR on WHOIS has been widely discussed, DoH (as well as the similar DNS over TLS) is a potentially more consequential question that has been lurking in the background without any process being started by the organization to formally discuss or interact with it. Under DoH, name servers are not queried directly by the client; instead, DNS queries are sent and responses received over HTTPS through a DoH provider, obfuscating the user's queries. This feature is rapidly moving towards becoming enabled by default in most, if not all, browsers.
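As an illustration, here is a minimal sketch of a DoH lookup using the JSON convenience API that some public resolvers expose (RFC 8484 itself specifies binary DNS messages over HTTPS); it assumes the third-party requests library and Cloudflare's public endpoint:

```python
# Minimal DoH lookup sketch using the JSON convenience API exposed by some
# public resolvers (RFC 8484 proper carries binary DNS messages over HTTPS).
# Assumes the third-party "requests" library is installed.
import requests

DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"   # a public DoH resolver

def doh_lookup(name: str, rdtype: str = "A") -> list:
    response = requests.get(
        DOH_ENDPOINT,
        params={"name": name, "type": rdtype},
        headers={"accept": "application/dns-json"},
        timeout=5,
    )
    response.raise_for_status()
    # The resolver, not the client, performs the resolution; the query travels
    # inside HTTPS, so on-path observers see only traffic to the DoH provider,
    # which is what makes the provider a gatekeeper.
    return response.json().get("Answer", [])

for record in doh_lookup("example.com"):
    print(record["name"], record["type"], record["data"])
```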

While this has very clear privacy and security advantages, it also means that the DoH provider becomes the de facto gatekeeper of the DNS, able to override decisions made by ICANN and maintain a spin-off version of the DNS if it so desires. I, as well as other researchers, have argued that this direct control, this policy through gatekeeping, is exactly what sets ICANN apart from other institutions within this ecosystem and gives it teeth. Losing grip over that final authority would mean a significant loss of enforcement capability.

With such concerns looming on the horizon, Marby's second goal seems clearer, albeit the reaction is quite belated. Both situations have already slipped past ICANN's sphere of influence, and all that is left for the community is to react rather than preempt, a posture that has been required for the past several years. It makes a lot of sense to look towards the future and anticipate issues, but this should have been a top concern at least since the IANA stewardship transition.

This distinction between IG and TIG could be derived from the fact that the organization's strictly stated mission grows blurrier by the day, whether it wants that to happen or not. The "DNS Abuse Framework," an industry-organized initiative that started in 2019, already includes clear cases where specific webpage content should result in a takedown from the DNS, as in the case of human trafficking. With the increase in pressure for Trusted Notifier programs, this tendency is bound to only grow.

ICANN exerts a disproportionate amount of influence over the Internet in relation to its size exactly due to the fine line it walks between acting as a technical body and, at the same time, having quite a direct impact over what content is reachable over the global network. Other technical bodies have more subtle ways to affect content and policies, such as in the case of HTML5's incorporation of DRM in its standards or even DoH itself, but only ICANN can literally flip a switch and make a website disappear.

Let us not forget that this comes almost a year after ICANN applied to the International Telecommunications Union (ITU) for ITU-D Sector membership, which was already a signal of its intention to have more formal representation within other transnational bodies. This approach, taken as a whole, points towards an admission that ICANN cannot exist in a bubble, needing to be present within both what it considers to be IG and what it considers to be TIG to thrive.

However, the TIG distinction itself puts ICANN in a bind. What is ICANN: IG or TIG? No matter what answer one is inclined to give, so many holes can be poked in each option that the most sensible answer has to be "IG in the traditional sense," or simply that it is both. By making this distinction, ICANN itself might be failing to assert its role within the ecosystem, creating a division that only serves to further complicate its own claim to legitimacy. While in rhetoric it has to maintain that it is strictly technical in every sense, the reality points towards a more mixed role.

It remains to be seen whether this term will be used consistently going forward or whether it is an artifact of this specific document, but it certainly requires a more transparent definition for it to make sense and possibly find some sort of adoption. As it stands, it raises more questions than it answers, perhaps even weakening ICANN's position within an area in which everything is inherently interconnected. This, fortunately or unfortunately, leaves open the question of whether there is such a thing as Technical Internet Governance.

Written by Mark Datysgeld, Incoming GNSO Councilor at ICANN

Follow CircleID on Twitter

More under: DNS, ICANN, Internet Governance, Policy & Regulation

Categories: News and Updates

First Round of U.S. Layoffs Due to Huawei Blockade

Tue, 2020-09-01 20:33

NeoPhotonics, the Nasdaq-listed producer of various optical communications products, including silicon photonics and photonic integrated circuits (PICs), warned investors this week that the new restrictions on business with China's Huawei — its largest customer — could have a major impact on future sales. Dave Burstein writes to report: NeoPhotonics has lost $40M in quarterly sales to Huawei, a brutal blow to a company with quarterly sales of ~$100 million. They promise 'appropriate expense adjustments and structural actions to mitigate the impact of revenue declines.' That almost definitely means they will fire a lot of people. Huawei makes most of its own optical parts and will probably soon develop an alternative. I suspect Huawei's purchases were especially high as they stock up to withstand the U.S. blockade.

Follow CircleID on Twitter

More under: Internet Governance, Mobile Internet, Policy & Regulation, Telecom

Categories: News and Updates

ICANN Introduces Pandemic Internet Access Reimbursement Program

Tue, 2020-09-01 20:05

The Internet Corporation for Assigned Names and Numbers (ICANN) on Monday announced its Pandemic Internet Access Program Pilot for the upcoming ICANN69 meetings. The pilot program will offer community members with limited Internet capacity financial assistance to increase their Internet bandwidth during this remote public meeting. ICANN says the services must be rendered by a business entity or Internet service provider, specifically for Internet or cellular data service. Interested community members must apply in advance of ICANN69. The application deadline is Friday, 2 October 2020, at 23:59 Coordinated Universal Time (UTC). The application is available here.

Follow CircleID on Twitter

More under: Broadband, ICANN

Categories: News and Updates

Is Telemedicine Here to Stay?

Tue, 2020-09-01 19:39

It's going to be interesting to see if telemedicine stays after the end of the pandemic. In the past months, telemedicine visits have skyrocketed. During March and April, telemedicine billings were almost $4 billion, compared to only $60 million for the same months a year earlier.

As soon as Medicare and other insurance plans agreed to cover telemedicine, a lot of doctors insisted on remote visits during the first few months of the pandemic. In those early months, we didn't know a lot about the virus, and doctor offices were exercising extreme caution about seeing patients. But now, only four months later, a lot of doctor's offices are back to somewhat normal patient volumes, done by screening patients at the door for temperature and symptoms.

I had two telemedicine visits during April, and the experience felt familiar since I was spending a lot of my day on Zoom meetings that month. These were Zoom-like connections using specialized software to protect patient confidentiality, but with a clearly higher-resolution camera (and more bandwidth used) at the doctor's end. I was put on hold waiting for the doctor just as I would have been in the doctor's office. One of my two sessions dropped in the middle when the doctor's office experienced a 'glitch' in bandwidth. That particular doctor's office buys broadband from the local cable incumbent, and I wasn't surprised to hear that they were having trouble maintaining multiple simultaneous telemedicine connections. It's the same problem lots of homes have had during the pandemic when multiple family members try to connect to school and work servers at the same time.

One of my two telemedicine sessions was a little less than fully satisfactory. I got an infected finger from digging in the dirt, something many gardeners get occasionally. The visit would have been easier with a live doctor who could have physically looked at my finger. It was not easy trying to describe the severity of the problem to the doctor over a computer connection. The second visit was to talk with a doctor about test results, and during the telemedicine visit, I wondered why all such doctor meetings aren't done remotely. It seems unnecessary to march patients through waiting rooms full of sick patients just to have a chat with a doctor.

There was a recent article about the topic in Forbes that postulates that the future of telemedicine will be determined by a combination of the acceptance by doctors and insurance companies. Many doctors have now had a taste of the technology. The doctors that saw me said that the technology was so new to them at the time that they hadn't yet formed an opinion of the experience. It also seems likely that the telemedicine platforms in place now will get much feedback from doctors and improve in the next round of software upgrades.

The recent experience will also lead a lot of doctor's offices to look harder at their broadband provider. Like most of us, a doctor's office has historically relied a lot more on download speed than upload speed. I think many doctor's offices will find themselves unhappy with cable modem service or DSL broadband that has been satisfactory in the past. Doctors will join the chorus of those advocating for faster broadband speeds — particularly upload speeds.

Telemedicine also means a change for patients. In the two sessions, the doctor wanted to know my basic metrics — blood pressure, temperature, and oxygen levels. It so happens that we already had the devices at home needed to answer those questions, but I have to think that most households do not.

I don't think anybody is in a position to predict how insurance companies will deal with telemedicine. Most of them now allow it, and some have already expanded the use of telemedicine visits through the end of the year. The Forbes article suggests that insurance companies might want to compensate doctors at a lower rate for telemedicine visits, and if so, that's probably not a good sign for doctors continuing the practice.

My prediction is that telemedicine visits will not stay at the current high level, but that they will be here to stay. I think when somebody books a visit to a doctor that they'll be given a telemedicine option when the reason for the visit doesn't require an examination. The big issue that will continue to arise is the number of homes without adequate bandwidth to hold a telemedicine session. We know there are millions of people in rural America who can't make and maintain a secure connection for this purpose. There are likely equal millions in cities that either don't have a home computer or a home broadband connection. And there will be many homes with so-so broadband that will have trouble maintaining a telemedicine connection. Telemedicine is going to lay bare all of our broadband shortcomings.

Written by Doug Dawson, President at CCG Consulting

Follow CircleID on Twitter

More under: Access Providers, Broadband

Categories: News and Updates
