In the Matter of
IMPROVEMENT OF TECHNICAL MANAGEMENT OF INTERNET NAMES AND ADDRESSES; PROPOSED RULE
Docket No. 980212036-8036-01
gTLD-MoU Policy Advisory Body
Kent Crispin, Chair
PO Box 21117
Castro Valley, CA USA
19-Mar-1998
The gTLD-MoU Policy Advisory Body (PAB) is a loose-knit organization composed of the signatories of the gTLD-MoU, and created under the auspices of the gTLD-MoU, along with the Policy Oversight Committee (POC) and the Council of Registrars (CORE).
About 170 organizations from the following countries and regions are represented in PAB: Albania, American Samoa, Austria, Australia, Bahamas, Belgium, Canada, Channel Islands, China, Czech Republic, Denmark, France, Germany, Ghana, Guadeloupe FWI, Hong Kong, Indonesia, Ireland, Israel, Italy, Japan, Korea, Luxembourg, Mexico, Monaco, Netherlands, New Zealand, Norway, Oman, Philippines, Romania, Singapore, Spain, Sweden, Switzerland, Thailand, Togo, UK, and the USA. (The rest of the 219 signatories of the gTLD-MoU elected not to participate in PAB.) PAB members represent a very broad range of interests: very small sole proprietorships, giant trade organizations, large and small corporations, international organizations, public service and non-profit interest groups, and others.
The PAB exists to provide broad input into the gTLD-MoU structure. It is easy to become a member: there are no dues or fees, and the signature on the MoU can include reservations or qualifications on agreement. The PAB has no legally mandated internal structure; instead, it organizes itself and operates according to the Internet "rough consensus" model. The POC is constrained through the gTLD-MoU to seek the consensus of PAB on certain matters.
These comments represent the rough consensus of the PAB.
They also represent the accumulated experience of individuals who have spent a great deal of time thinking about management and governance of the domain name system. While the people who produced the Green Paper (GP) have certainly invested a great deal of work and thought in their proposal, many members of the PAB have been involved in these issues for far longer, and collectively represent a far greater investment of time and energy in their consideration. It is true that some of the members of PAB have a vested interest in the outcome of this process. But it is a considered and informed vested interest.
Furthermore, it is also important to note that the concerns of PAB are not congruent with those of the POC or of CORE. On the contrary, the PAB exists to countervail against CORE and to keep POC responsive to the wishes, wants, and desires of the Internet Community.
The rationale for the GP is "to privatize, increase competition in, and promote international participation in the domain name system." These are laudable goals. However, the Internet community has been quite successful at accomplishing them without government intervention. In particular, the work of privatizing and increasing competitiveness in the Domain Name System has been a long effort of over 4 years, and the only serious remaining vestige is the US Government (USG) contract with Network Solutions, Inc. (NSI), due to expire shortly: here the USG needs to act, whereas in all other areas the Internet is evolving its own solutions.
International participation in this effort has been strongly evident for a long time. In large part, this reflects the growing internationalization of the Internet; by some counts, a substantial majority of new users are to be found in Europe, and many expect a raw majority of users, content providers, and Internet-based electronic commerce to be found outside the territory of the United States before the start of the twenty-first century.
The GP proposes US Government intervention in the Internet in two areas:
Though it has received government funding since its inception, IANA has run without government involvement or intervention for many years. This is well-known community knowledge: From RFC1601 "Charter of the Internet Architecture Board (IAB)":
"(d) RFC Series and IANA
"The IAB is responsible for editorial management and publication of the Request for Comments (RFC) document series, and for administration of the various Internet assigned numbers."
The date on this RFC is 1994. IANA and IAB have been operating under it since that time, with the full support and understanding of the US Government.
This US Government knowledge is also reflected in the NSI cooperative agreement, signed by NSF, in late 1992 (Section 3C):
"C. The Awardee shall provide registration services in accordance with the provisions of RFC 1174. As stated in RFC 1174:
"[T]he Internet system has employed a central Internet Assigned Numbers Authority (IANA) for the allocation and assignment of various numeric identifiers needed for the operation of the Internet. The IANA function is currently performed by the University of Southern California's Information Sciences Institute. The IANA has the discretionary authority to delegate portions of this responsibility..." [Emphasis added.]
Note that the "discretionary authority" mentioned above clearly includes the ability to delegate new TLDs, since many have been delegated by IANA over the intervening years. Now, the USG has restrained IANA from adding the 7 gTLDs defined in the IAHC plan, and the GP proposes to install "up to 5" gTLDs, essentially under direct USG authority.
These efforts to restrict IANA's authority to delegate TLDs have been seen by some as an attempt, not to "privatize and internationalize" the domain name system, but instead to recapture the very portions of the DNS which have functioned well in the private sector for years. And in fact, many observers fear that the reason for this sudden interest in IANA on the part of the USG is just IANA's unwillingness to administer the root in a manner that preserves the very strong economic advantage of a select private group, with long-standing ties to the United States Government.
Prior to the production of the GP, and in keeping with its independent, fair, impartial, and responsible reputation, IANA embarked on a project to develop a new structure that would insulate it from undue pressure, but at the same time make it responsive to the community. The rising tide of international concern in response to the GP should make it clear that IANA must continue this activity on its own, without the guiding hand of the USG (regardless of how well intentioned).
Here the USG could undo years of problematic oversight: the NSI contract should terminate cleanly, with no special privileges for NSI. In this process, NSI should be treated fairly, and not be placed at a disadvantage. However, it must be monitored closely in much the same manner that any broken-up monopoly would be. PAB believes the simplest and most straightforward way to accomplish this would be for NSI to become one of the CORE registrars, and for the registration databases for .com, .net, and .org to be transferred to IANA and thence to CORE, over a reasonable transition period.
In summary, therefore, the intervention of the US Government in the Internet is only justified in one area, ramping down the NSI contract. This one potentially useful service is, unfortunately, badly handled in the Green Paper: instead of a graceful ramp-down, the GP provides (as discussed below) a permanent monopoly for NSI. (The GP provides for regulation of NSI during the transition period, but after that no controls are evident).
While the GP proposes intervention in the international Internet in these two areas, it passes by almost without comment the one area of the Internet where the US Government has undisputed authority, the management of the .us ccTLD, and leaves it for a later paper.
Many observers feel that in large measure the problems with the gTLDs stem from the underutilization of the .us ccTLD. The US Government has authority to assign control of the .us domain to whatever entity it chooses (perhaps NSI); to regulate that entity in any way it sees fit; and to conduct any experiments it wishes. This is an area where the GP could be vastly expanded, and garner public acclaim in so doing.
This issue is of fundamental importance, and we discuss it in some detail.
The GP attempts to maximize competition by proposing competing registries as well as competing registrars. It also advocates a "go-slow" approach of cautious experimentation, and, while taking note of possibilities for non-competitive behavior, expresses the hope that competition between registries will work against monopoly practices.
These are laudable goals, but unfortunately, the proposed result is not consistent with them. Not only does it fail at these goals; it actually works strongly against them. Even worse, it sets the stage for inevitable intervention by government regulators at a future date.
In fairness, however, it must be noted that this failure is quite understandable, since the situation is complex, and has been made even more obscure by a history of bitter debate.
There are two aspects to the GP's proposal that need to be considered: First, it fails its goal of cautious experimentation; and second, it gives insufficient consideration to the anti-competitive forces intrinsic to the domain name system:
By then the new IANA (nIANA) will have been formed and been running independently for some time. It will then have the problem of dealing with several entrenched monopoly registries, with no real power or authority to deal with them.
Two possible scenarios play out from here: First, nIANA might be captured by these large registries, and exist as essentially a powerless puppet. Or, more likely, the US Government will see a need to step in and regulate: After all, nIANA is proposed by the GP to be a US non-profit corporation, subject to US laws, and likely most of the registries will be US corporations as well.
Many are suspicious that this is the intended long-term result.
Thus, by adopting only a very limited set of new TLDs, with almost certainly very different meanings, the GP has again negated the validity of the "experiment" of competition.
Running this experiment is not risk-free: If the GP authors have miscalculated in their assessment of the competitive forces involved, the result will be establishment of 5 permanently endowed parallel unregulated monopolies. Some of these will certainly have large, steady cash flows that can be used to fuel lengthy legal battles, to lobby many politicians, and to otherwise work to maintain their monopoly status. The risk, in short, is that of instituting flawed public policy that will take years to repair, if it can be repaired at all.
Stripped to the essentials, the GP position can be paraphrased as follows: "We agree that registries may be monopolies and may engage in unfair and uncompetitive practices, but maybe they won't. We don't understand it fully so we will do an experiment. We ignore the issue of how things will be cleaned up if the experiment fails."
The strategy of putting questions off to an "experiment" allows the GP to gloss over the issue of possible anti-competitive practices on the part of for-profit registries. But this issue is the crux of the entire DNS governance problem. The essential nature of DNS is the delegation of natural monopolies -- every domain, at whatever level, has total control over its subdomains. There is no question of this: it is a feature intrinsic to the design of DNS.
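The intrinsic-monopoly property described above can be sketched in a few lines of code: whoever operates a zone is the sole authority for every name beneath it, and only that operator can create subdomains. The zone names and operator labels below are hypothetical illustrations, not actual root-zone data.

```python
# Minimal sketch of DNS delegation. Each zone has exactly one operator,
# and all names beneath a zone are controlled by that chain of operators.
# Names and operators here are illustrative only.

class Zone:
    def __init__(self, name, operator):
        self.name = name          # e.g. "com"
        self.operator = operator  # the single authority for this zone
        self.children = {}

    def delegate(self, label, operator):
        """Only this zone's operator can create a subdomain under it."""
        child = Zone(f"{label}.{self.name}".strip("."), operator)
        self.children[label] = child
        return child

    def authority_for(self, labels):
        """Walk down the tree; every step passes through one operator."""
        if not labels:
            return self.operator
        return self.children[labels[-1]].authority_for(labels[:-1])

root = Zone("", "IANA")
com = root.delegate("com", "registry-for-com")  # one registry per TLD
com.delegate("example", "example-holder")

# Every second-level name under .com must pass through the same registry:
print(root.authority_for(["example", "com"]))   # the delegated holder
print(com.operator)                             # the single .com authority
```

The point of the sketch is structural: there is no path to "example.com" that bypasses the .com operator, which is exactly the natural monopoly the text describes.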
So, far more important than the flawed design of the experiment is the fact that GP clearly gives short shrift to its consideration of the competitive environment that would exist between for-profit registries. In this area there are several serious oversights on the part of the GP; here are some of them:
The initial registration of a domain operates under a different competitive model than the ongoing registration renewals. This is a crucial distinction, because most of the opportunities for non-competitive behavior occur in this latter phase. Proponents of the "competitive registries" model concentrate on the first phase, because indeed, if you concentrate only on initial registrations, there are possibilities for competition. But if you examine the situation for renewals, opportunities for monopoly practices abound.
A registration is typically for 1 year. Therefore, many of the monopoly practices of concern are relatively long term problems, and won't develop until at least a year after unregulated monopolies start.
The GP proposes to grant the right to "market" 5 additional gTLDs. This is, therefore, a government grant of a commodity to market. In order for this grant to be fair and competitive, the awardees should be given items of approximately equal value -- in radio spectrum auctions, for example, one slice of the spectrum up for bid is much like any other such slice.
But gTLD names are not like radio spectrum. In fact, each gTLD name is unique, and it is completely obvious that some are much more valuable than others: ".shop" is clearly more valuable than, say, ".zj5".
So the government giveaway proposed by the GP is akin to dispensing individual real estate franchises, some located in downtown Manhattan and some located in the middle of the Nevada desert. There is no obviously fair way to do this.
But, once again, the GP ignores such concerns: The initial configuration proposed by the GP gives NSI a registry with three popular and established TLDs (.com, .net, and .org), a very large steady revenue stream, and an already existing infrastructure, generously paid for by the US Government. NSI's "competitors" are to be "up to 5" registries with names of unknown popularity, with no initial income, and having to invest in the development of their own infrastructure. This is an environment calculated to cause consolidation, rather than competition.
This indicates a serious misunderstanding of how domain names are used. In particular, it ignores the fact that domain names are components of URL's and thus are embedded in web links. These links appear in home pages, in "related information" links, in pages that are collections of information on particular topics, in browser "bookmark" lists, in search engines, and so on.
A popular site may be linked to from hundreds of thousands of other sites. For example, an Altavista search finds 160,000 web pages with links to "amazon.com", and 850,000 pages with links to "yahoo.com". [Note that this result is quite conservative, and does not include pages that Altavista has not searched, browser bookmarks, or other search engines.]
If Yahoo moved from "yahoo.com" to "yahoo.net", those 850,000 links would suddenly cease functioning, and people using those 850,000 links to click through to Yahoo would suddenly find that Yahoo was gone.
In the particular case of Amazon, that company has recently invested in a very expensive ad campaign publicizing its "amazon.com" name. A change to "amazon.shop", for example, would turn that investment to waste, and incur the same cost over again.
Parenthetically, this example clearly indicates that it is not marketing on the part of Network Solutions, Inc, that is making .com popular. Quite exactly the reverse: what is making .com popular is the collective effort of companies like Amazon and Yahoo. NSI is getting free use of all that effort and expense on the part of thousands of companies, and happily claiming that it is the company that "brought us .com".
Amazon and Yahoo didn't create all those hundreds of thousands of links, and they have little control over them. Those links were created by individuals and webmasters all over the world who saw value in linking to Yahoo and Amazon. Thus, every one of the links represents goodwill, and therefore, positive economic value.
This is even more true with small organizations that may only have a few hundred or a few thousand links -- the links represent the focused attention of interested parties. For a business they represent the continued attention of interested customers, and, therefore, income.
It takes substantial time to build up a body of links like this, and the market value is well known. The first step in marketing a new web site is to submit the URL to various search engines. It takes weeks or months for the search engines to be primed, and much longer for webmasters to build pages with links, and for users to bookmark the URL.
So, there is absolutely no question that the "lock-in" effect is real and substantial, and there is no question, therefore, that an unregulated registry has the ability to gouge higher prices from established customers. NSI could raise Yahoo's rate by a factor of 20, or more, and there would be absolutely nothing that Yahoo could do but pay.
The GP concedes that there is a possibility of monopolistic or anti-competitive practices with for-profit registries, but essentially ignores the legal problem of dealing with these practices if they occur.
There is no question that some kind of regulation is necessary. For example, the GP states "Registries will set standards for registrars with which they wish to do business." Without some kind of regulatory oversight this would allow NSI's registry to set conditions especially favorable to its WorldNIC registrar. Such a special relationship would be very difficult to control, even with strong regulatory oversight. Without oversight it would almost certainly translate into de-facto vertical integration. This vertical integration would likely translate into horizontal domination of the market for domain name registrations.
That is, absent regulation NSI could clearly leverage its control of the .com/.net/.org registry into the market for second-level domain registration.
The GP states that registries should treat "all registrars on a nondiscriminatory basis, with respect to pricing, access, and rules". But without some regulatory mechanism for enforcement these are just words. It appears that perhaps the GP authors have assumed that all registries will be subject to US anti-trust law. If that is the assumption, then it clearly follows that the GP intends de facto US control of this vital component of the Internet. If that is not the intention, then it is clear that the authors have not come to grips with the international character of the Internet, and in particular the fact that the Internet crosses jurisdictional boundaries.
Another possible form of regulation resides in the relationship between the new IANA and registries. Clearly, nIANA will be setting at least some policies for registries, and therefore there is the issue of how these policies are enforced. What are the sanctions that nIANA could levy against NSI's .com registry, for example? Certainly nIANA cannot remove .com from the root domain, or undertake any other action that would harm the end users of the DNS. It is possible that nIANA could bring suit against an errant registry, but nIANA is described as a non-profit corporation, and may find it difficult to engage in a legal battle with a registry with an income stream in the $100,000,000 range.
Since the GP did not address this issue, we can only speculate as to what the authors might have in mind. However, one approach would be a standard "registry operators contract" that grants the registry operator the right to operate the registry, but not ownership of it: ownership of the registry -- all the data and all intellectual property rights to the TLD name -- would remain with nIANA, with a requirement that the registry operator escrow the data, in case the registry operator needed to be replaced.
If this contract were well crafted this could be a sensible approach, giving nIANA the necessary legal leverage with which to control registries. However, a standard "registry operators contract" would have to be written in the context of the laws of some country. We can only imagine what country the US Government would suggest.
One thing that does seem clear: if nIANA is to have any hope of enforcing policies, it must at a minimum assert control over all the registry data, and any intellectual property rights associated with a gTLD. Otherwise it has no hope whatsoever of enforcement of standards or policies for registries.
However, it is important to remember that putting such control in the hands of nIANA is also fraught with possibilities for abuse. The important lesson here is that the control and regulation issues are truly deep and complex, and the GP has, deliberately or accidentally, completely ignored them.
According to the definitions of registry and registrar given in the GP, registries deal only with registrars, not with end customers. That is, a registry is fundamentally a back-office database operation that has no dealings with end users. A registry operator is not even a wholesale outlet providing goods: it is a service company, providing an almost totally automated service, a service involving a single distinguishing characteristic (a TLD name) which by definition cannot change.
A registrar, on the other hand, handles essentially all the interactions with customers. Registrars can be small and serve local areas -- a gTLD registry must serve the world. Registrars can cater to different market segments -- registries are blind to market segments. Registrars can offer different service levels -- registries offer only an interface to registrars, and most certainly should not favor one registrar over another. Thus, structurally, registries have little opportunity to compete, while registrars have multiple dimensions for differentiation.
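The structural split described above can be sketched concretely: the registry is a thin, automated database that deals only with registrars, while registrars handle all customer interaction and can differentiate freely. All registry, registrar, and customer names below are hypothetical.

```python
# Sketch of the registry/registrar split. The registry is a back-office
# database with no end-user interface; registrars face the customers.
# All names here are invented for illustration.

class Registry:
    """Back-office database for one TLD; talks only to registrars."""
    def __init__(self, tld):
        self.tld = tld
        self.names = {}  # second-level name -> registrar of record

    def register(self, registrar, name):
        if name in self.names:
            return False          # name taken: first come, first served
        self.names[name] = registrar
        return True

class Registrar:
    """Customer-facing; can differentiate on service, market, language."""
    def __init__(self, label, registries):
        self.label = label
        self.registries = {r.tld: r for r in registries}

    def register_for_customer(self, customer, fqdn):
        name, tld = fqdn.rsplit(".", 1)
        return self.registries[tld].register(self.label, name)

com = Registry("com")
shop = Registry("shop")
r1 = Registrar("local-registrar", [com, shop])
r2 = Registrar("global-registrar", [com, shop])

print(r1.register_for_customer("alice", "example.com"))  # True: available
print(r2.register_for_customer("bob", "example.com"))    # False: one registry per TLD
print(r2.register_for_customer("bob", "example.shop"))   # True: different registry
```

Note that both registrars reach the same `Registry` object for a given TLD: registrars can multiply and compete, but each TLD's database remains a single point of control.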
In the long run the competition between registrars may make the profitability of a registrar as a stand-alone business very problematic. Likely, in the long run there will be thousands of registrars, and most of them will do domain registrations as a sideline, not as their main business, because there won't be much profit to be gained. Hence the importance of a standardized registry-registrar interface, standardized billing relationships, and so on.
These economic forces and structural characteristics will tend to homogenize the policies of gTLD registries.
We have experience demonstrating homogenization of gTLD policies: in theory the TLDs .com, .net, and .org are by policy supposed to serve different segments of customer base, and originally some attempt was made to enforce this policy.
However, such enforcement proved impractical, even when the registrar and the registry were the same entity.
Enforcement is much more complex when the registry and the registrar are split. It is obvious that the only practical place where TLD policy can be enforced is the registrar, not the registry: the registry, as a back-office operation, is simply not in a position to check the credentials of the end user, who may be anywhere in the world.
Even more, there is strong incentive not to check such things. Checking costs money, and every registrant that fails a check doesn't pay the registration fee. So, with the GP for-profit registries there are strong economic forces to accept every registration, and these forces will cause all such registries to quickly move to an essentially uniform set of policies, policies which basically accept every registration blindly.
In sum, it appears that there is confusion concerning the roles of gTLD registry and registrar. Registries do not interact with end customers; registrars do. This split has important and subtle implications that should not be overlooked.
The GP states:
"Competing TLDs would seek to heighten their efficiency, lower their prices, and provide additional value-added services."
In fact, there isn't much efficiency that can be gained. Registries by themselves are very simple operations: they merely maintain names in databases. The registry operation at NSI is a small part of its business.
And any efficiency that would be gained would be more than lost through the fact that the registrars (which would be far more numerous than registries) would all have to be dealing with different interfaces. This would recall the unsatisfactory situation during the dawn of telephone communications, when the multiple competing providers installed incompatible systems: anyone who wished to speak to a user was required to support the termination equipment and cabling needed to reach that user.
The CORE model is far more efficient at reducing prices -- in that case the registrars collectively run a registry on a cost-recovery basis. This registry maintains uniform policies, established by POC and implemented by CORE, with the actual operation of the database contracted out to a separate organization (at present Emergent, Inc). This contract is let on a competitive basis, and in fact, the system is designed so that multiple contractors can be utilized. This arrangement allows CORE to seek the lowest-cost operation, while keeping interfaces and policies consistent. At the same time, the non-profit nature of the registry removes the problems of monopoly pricing that would otherwise plague the Internet community.
The GP says:
"Investments in registries could be recouped through branding and marketing."
This is a total waste for consumers, of course. The way that registries will recoup their investment is through recharge to registrars, which will in turn be passed directly through to the consumers. If there is no demand for a particular TLD to begin with it is pure overhead to consumers to try to create that demand.
And, as described above, it is just unreasonable for a back-office service organization to be doing "branding and marketing." Branding and marketing will and should be done by registrars. WorldNIC is what customers will see, not the small subsidiary company that runs the database, and WorldNIC will be registering domains in all gTLDs. It will quickly become apparent to WorldNIC that there is no particular advantage to being tied to .com.
In any case, most consumers will have no idea what their registrar is doing to register a name: currently most domain names are registered through the intermediary of an ISP, and the customer barely knows the InterNIC exists.
The GP says:
"The efficiency, convenience, and service levels associated with the assignment of names could ultimately differ from one TLD registry to another."
If nIANA is doing its job properly, and establishing enforceable standards for registries, there will be no significant difference in service levels. A registry is not a complicated operation: fundamentally it is a rather simple database. But even more important, customers simply won't see any difference in service levels that may exist. They will still be going through their ISP (who may now be registrars).
This is not theory; this is already proven in practice: there already are many TLD registries, with vastly different service levels, and companies have been formed to register names in all of them. In practice, experience shows that, except at the very low end (free or nearly free domains), variation in cost and service level has little to do with what domain a customer chooses.
In summary, then, it is clear that the GP authors have ignored the most basic problem in DNS governance: DNS is intrinsically monopolistic. Any successful plan for managing DNS simply cannot ignore this fact.
The Green Paper claims to be a consensus building process, yet the IAHC plan is not mentioned at all. It is inconceivable that any attempt to deal with the issue of top-level domains would ignore an effort such as that embodied in the MoU. Support for the MoU is not imaginary; it is real. In ignoring the MoU, the GP also ignores:
It is true that the MoU has been contentious. To some extent this reflects the fact that all debate on the Internet tends to the fractious. However, the GP authors should now be abundantly aware that any proposal that approaches reality will be contentious: the unfortunate fact is that there are true competing interests in this arena, and any plan will leave some parties unhappy.
Some of the voices expressing unhappiness with the gTLD-MoU have been very loud. But volume is not an appropriate measure of legitimacy.
By any reasonable measure the IAHC process is legitimate:
Note that the GP, while ignoring the MoU, at the same time claims that IANA is a US Government sponsored activity. If so, then the GP reverses established USG efforts, since IANA authorized and approved the IAHC work. Further, a senior staff member of NSF/FNC participated in the IAHC, so that the work clearly is based on a USG expectation of that work providing continuity to the administration of gTLDs. The GP effectively throws out the year's worth of effort and investment made under that authority.
The GP repudiation of IANA's authority to delegate to the IAHC, and its substitution of a plan that clearly is much more favorable to US interests, also undermines the environment of authority, trust, and stability that IANA has built up over the years. This is very serious, because many international observers believe that if IANA's efforts can be overturned by the actions of invisible functionaries in the US Government, then IANA no longer represents a stable point of reference for the root zone.
This is amply demonstrated by the strongly negative European Union response, and the strong disapproval coming from the Australian and French governments. However, this negative response was predictable from the following features of the GP:
But there is a more fundamental problem: any unilateral action of the USG, like the GP, has an intrinsic risk of being seen as an attempt by US interests to take control of the Internet. The GP authors ignored that risk, with predictable results. As a result, the international credibility of the entire USG effort has been seriously undermined.
The consequences are far more important than the mere embarrassment of the USG: by moving the issue to the arena of international diplomacy the USG has undermined two of its own basic goals: stability, and the privatization of the Internet.
The worst possible result, fracturing of the DNS root zone, is now openly discussed as a result preferable to US hegemony over the Internet. The Internet presents the United States and its enterprises with an opportunity to compete in a seamless international electronic marketplace, and US enterprises are at present very well positioned to take advantage of this opportunity. Establishment of European or Australian root servers, or similar protectionist measures, would be of great harm to these American interests. The Green Paper, with its subtle claim of US control, challenges the rest of the world to create its own infrastructure. It is foolhardy and reckless to believe that the world does not have the means or the will to do so.
Internationally, there are only two viable options for any kind of Internet governance: either a plan that springs authentically from the private sector, independent of any government, or a plan hammered out through an international treaty process. The latter approach would take years.
Of course, a coherent, widely supported plan has already come from the private sector: the IAHC plan. The GP can regain international credibility by recognizing the IAHC plan, and offering to work with POC/CORE/PAB to implement it. Any other approach runs a serious risk of aggravating the suspicions already rampant.
Currently all the extant gTLDs are under the control of NSI's dispute resolution policy. While trademark attorneys and the public alike almost uniformly criticize this policy, it does have the virtue of uniformity.
The GP proposes multiple unregulated monopoly registries, each with its own dispute resolution policy. From the point of view of those trying to defend their trademarks this is pure madness: instead of one offensive policy they have multiple policies, in multiple jurisdictions. And because of the widespread dissatisfaction with the NSI dispute resolution process, different standards are certain to be developed.
As the NSI experience shows, companies are loath to disturb their dispute policies once they are in place. The probable result of the GP plan, therefore, is the permanent entrenchment of conflicting policies over all the gTLDs.
The GP devotes a large appendix to a description of technical requirements for registries and registrars.
It is certainly true that registries and registrars should be held to standards. But this appendix raises several troubling questions: 1) whence comes the competence of the GP to define such requirements; 2) whence comes the authority to enforce these standards; 3) what process did the GP authors go through to arrive at the rather specific technical standards in the GP; and finally, 4) why is a policy document defining technical standards?
POC/CORE enlisted a set of volunteer experts to produce the CORE RFP, which was further reviewed by technical experts from AT&T and elsewhere. In contrast, the GP includes a description of technical standards for registries and registrars that has gone through no such review.
It appears that some of the GP requirements were pulled from the CORE RFP, some from the IAHC report, and some from elsewhere. While the choice of sources for this pastiche might be flattering to some, the requirements in the CORE RFP and the IAHC report were generated with a particular problem domain in mind, and there is an internal consistency to the results.
The GP requirements, not having gone through this level of review and development, do not have this internal consistency. This lack of depth is evident, for example, when the GP speaks of requiring "encryption and authentication". The CORE RFP carefully considered the issue of encryption in a worldwide context, and concluded that digital signatures for authentication would be used, but encryption would not be used. This is because the use of encryption is encumbered in many countries, whereas the use of digital authentication is not. The GP ignores this crucial distinction. [Of course, there remains the nagging possibility that the GP authors assume all registries and registrars will actually operate under the US legal system.]
Also, the GP technical requirements for registrars are not just over-detailed, they are excessive, and serve no purpose except to increase the cost of registration, with no real benefit to the consumer. For example, there is absolutely no need for registrars to have dual connectivity to the net, and in some areas of the world this may be extremely difficult and expensive to obtain. It is useful to remember that the highly successful Nominet shared registry system in Great Britain requires only email access for registrars.
However, the point is not a particular flaw in the requirements. Rather, the point is that technical requirements should be developed through a technical process, with technical people involved, as they were for the CORE SRS. Such requirements simply do not belong in a document purporting to describe policy, such as the Green Paper.
The Green Paper as currently constituted is a stealth document. It speaks of privatization, but ignores the most prominent private effort already dealing with the same problems. It speaks of less government regulation, but sets the stage for extensive government regulation at a later date. It speaks of internationalization, but in fact firmly entrenches the central point of the Internet infrastructure in the United States. It speaks of competition, but in fact continues the NSI monopoly, and seeks to establish new monopolies. It is vague when it comes to important matters, like how regulation will actually be done; and it is very specific where it shouldn't be, as in the specification of technical standards.
The Green Paper has already sown the seeds of international distrust, and started us down the road of endless intergovernmental bickering over who controls what on the Internet. Without very substantive changes, then, the inevitable result will be a substantial delay in enacting any policy whatsoever, and the ultimate failure of the NTIA effort.
Therefore, the best course of action is for the GP to narrow its focus; to concentrate on the problem of how NSI can best be transitioned from a privileged government sanctioned monopoly to one player in a field of many; to create a comprehensive plan for the .US ccTLD; and to cooperate with POC, CORE and IANA to evolve a true private solution to the gTLD governance problem.