Archive

Archive for the ‘verisign’ Category

Verisign CEO: “I Don’t See Anybody Who’s Going to Abandon The .Com For a New gTLD”

February 6th, 2014 Comments off

Verisign CEO and President James Bidzos made some pretty interesting comments regarding the new gTLDs and the effect they may have on .com and .net registrations during the company’s earnings call today.

I have listened to the last several Verisign earnings calls, and this is the first time I can remember an analyst asking a pointed question about the new gTLDs and their effect on the .com and .net registry.

The question was asked by an analyst from JPMorgan Chase & Co.:

Any thought to whether the gTLD program, whether people coming in for new sites might actually be thinking about an alternative gTLD for .com? And maybe that’s having some sort of impact?

D. James Bidzos, Founder, Executive Chairman, Chief Executive Officer and President of Verisign:

“As far as the new gTLD program, I don’t think that, I don’t see anybody who’s going to abandon the .com for a new gTLD.

There’s a lot of strong anecdotal evidence that, that may not be the case.

I can give you an example. So, for example, you may have seen that the U.K. paper, dailymail.co.uk, uses a very typical English configuration for a web address, the company name under that U.K. configuration.

So dailymail.co.uk purchased dailymail.com.

They said that they purchased it because they wanted something that was more global that would allow them to get more traffic, especially in the U.S.

They paid GBP 1 million for it.

They bought it from The Charleston Daily Mail of Charleston, West Virginia, a 100-year-old Pulitzer Prize-winning newspaper, who, after they got their roughly $1.6 million for that domain name, were free to go out and buy whatever they wanted.

Then they chose to go out and buy CharlestonDailyMail.com.

If they bought it for retail, they probably paid about $10 for it.

So I think .com is very much the preferred, established reliable name.

I don’t know what’s going to happen in the future with the TLDs.

I’m sure some of them will do well, they’ll build some community.

But I guess I can give you one data point. If we look back, this is not the first time this has happened.

There have been some new TLDs before, and one of them that I think is actually a good example of what success might look like would be .co.

And I’m sure a lot of you are familiar with it.

.co is a short name, it’s just 2 letters.…

Categories: External Articles, verisign Tags:

Verisign Reports: 127.2 Million .Com/.Net Domains, Revenue Up 10%; Has $1.7 Billion In Bank

February 6th, 2014 Comments off

VeriSign, Inc. (NASDAQ: VRSN) reported financial results for the fourth quarter of 2013 and year ended Dec. 31, 2013.

VeriSign, Inc. and subsidiaries (“Verisign”) reported revenue of $246 million for the fourth quarter of 2013, up 7% from the same quarter in 2012.

The operating margin was 53% for the fourth quarter of 2013 compared to 58.8% for the same quarter in 2012.

Verisign reported net income of $292 million and diluted earnings per share (EPS) of $1.94 for the fourth quarter of 2013, compared to net income of $106 million and diluted EPS of $0.65 in the same quarter in 2012.

Financial Highlights

Verisign ended the fourth quarter with cash, cash equivalents and marketable securities of $1.7 billion, an increase of $167 million from year-end 2012.

Cash flow from operations was $147 million for the fourth quarter of 2013 and $579 million for the full year 2013 compared with $171 million for the same quarter in 2012 and $538 million for the full year 2012.

Business Highlights

Verisign Registry Services added 1.29 million net new names during the fourth quarter, ending with 127.2 million active domain names in the zone for .com and .net, which represents a 5 percent increase over the zone at the end of the fourth quarter in 2012.

In the fourth quarter, Verisign processed 8.2 million new domain name registrations for .com and .net as compared to 8.0 million for the same period in 2012.

During 2013, Verisign processed 34.0 million new domain name registrations as compared with 33.1 million for 2012.

The final .com and .net renewal rate for the third quarter of 2013 was 72.7 percent compared with 72.5 percent for the same quarter in 2012.

Renewal rates are not fully measurable until 45 days after the end of the quarter.

Non-GAAP Items…

Verisign: Rise In Value Of Bitcoin Causes Surge In .Com/.Net Domain Registrations: 22K In 2013

January 23rd, 2014 Comments off

Verisign just published a blog post on how the rising value of Bitcoin has led to a surge of .com and .net domain name registrations.

 

The value of a Bitcoin surged from roughly $13 at the beginning of 2013 to an eventual high of $1,137 at the end of November (The chart below, courtesy of Coinbase, documents the ebbs and flows of a Bitcoin’s value).

[Chart: Bitcoin’s price during 2013, courtesy of Coinbase]

By doing a search for the word “bitcoin” over the past six months, it is fairly easy to see a correlation between the two Bitcoin price surges shown above and surges in domain name registrations including the term “bitcoin” in the graph below.

[Graph: daily .com/.net registrations containing “bitcoin” over the past six months]

It is interesting to see that despite the relative modesty of the April price surge when compared to the November/December surge, daily registrations for both instances peaked at roughly the same number: 490 and 472, respectively. That being said, the density of the two domain registration surges in the graph shows that significantly more domains were registered as Bitcoin approached its all-time high.
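For readers who want to reproduce this kind of comparison, here is a minimal sketch (ours, not Verisign’s). It assumes two hypothetical CSV files, one with daily Bitcoin closing prices and one with daily counts of new registrations containing “bitcoin”; the file and column names are made up for illustration, and pandas is used to line the two series up and compute a simple correlation.

import pandas as pd

# Hypothetical inputs (not from Verisign's post): daily Bitcoin closing prices
# and daily counts of new .com/.net registrations containing "bitcoin".
prices = pd.read_csv("btc_daily_close.csv", parse_dates=["date"], index_col="date")
regs = pd.read_csv("bitcoin_domain_regs.csv", parse_dates=["date"], index_col="date")

# Align the two series on date and drop days missing from either file.
df = prices.join(regs, how="inner").dropna()

# Pearson correlation between price and registration count.
print("same-day correlation:", df["close"].corr(df["registrations"]))

# Registrations often lag the price move, so a lagged comparison can be more telling.
print("correlation with price led by 3 days:",
      df["close"].shift(3).corr(df["registrations"]))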

 

Did you register any Bitcoin related domain names in 2013? Are you planning to register some in the new year? Let us know in the comments below!…

Categories: .com, External Articles, verisign Tags:

Verisign: .Com Registrations Blow Past 112 Million Mark For The 1st Time

January 16th, 2014 Comments off


Verisign is reporting that the number of .com domain names in the active zone file has exceeded the 112,000,000 mark for the first time.

The number of .com domain name registrations in the active zone crossed the 112 million mark today, January 16th, 2014.

The number of .net domain names in the active zone file is down slightly but still well over the 15 million domain name mark.

 …

Categories: .com, Domains, External Articles, verisign Tags:

Verisign Is Taking Back Expired Two Letter & Two Number .TV Domains

December 27th, 2013 Comments off


Verisign, the registry for .TV, is apparently taking back any expired two-letter (LL.tv) domain names as well as any two-number (NN.tv) domain names.

Recently a domainer who had a backorder on a two-letter .TV domain and a two-number .TV domain received this notice from Verisign:

“Thank you for contacting Verisign Support. I reached out to our Operations team to determine where they were in their investigation. They have a resolution for the issue prepared. However, the business team has concluded that both 13.TV and GM.TV will be placed into Reserved status upon their deletion.

They will not be able to be registered by any registrar once that takes effect.

They are reserving these domains at the registry level for use in specialized promotions aimed at generating significant brand awareness and adoption of the .tv tld.

I do apologize for the inconvenience, I know you wanted to register these names.”

You can now see from the WHOIS record that the domain names have in fact been taken back:

Domain Name: 13.TV
Domain ID: 108626603
Updated Date: 2013-12-25T04:04:39Z
Creation Date: 2013-12-25T04:04:39Z
Expiration Date: 2023-12-25T04:04:39Z
Sponsoring Registrar: .TV RESERVED DOMAINS
Sponsoring Registrar IANA ID: 9998
Domain Status: SERVER-UPDATE-PROHIBITED
Name Server: No nameserver
DNSSEC: Unsigned delegation
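
A record like the one above can be pulled directly over the WHOIS protocol (RFC 3912), which is nothing more than a TCP query to port 43. Below is a minimal Python sketch of such a lookup; the server name whois.nic.tv is our assumption about the .tv registry WHOIS host, not something taken from the notice above.

import socket

def whois_lookup(domain: str, server: str = "whois.nic.tv", port: int = 43) -> str:
    """Send a plain RFC 3912 WHOIS query and return the raw text response."""
    with socket.create_connection((server, port), timeout=10) as sock:
        sock.sendall(domain.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

# Example: check whether the registry has parked a name with a reserved registrar.
record = whois_lookup("13.tv")
print(record)
if "RESERVED" in record.upper():
    print("Name appears to be held at the registry level.")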

Could Verisign do the same thing with an expired two-letter or two-number .net domain name?

What would stop them?…

Verisign Publishes Last Part Of Series On Possible Domain Collision: “SLD Blocking Is Too Risky Without TLD Rollback”

November 21st, 2013 Comments off

Verisign published the final part of its four-part series on domain collisions and the new gTLD program, entitled “SLD Blocking Is Too Risky Without TLD Rollback,” on the final day of the ICANN meeting in Argentina.

Here it is:

“ICANN’s second level domain (SLD) blocking proposal includes a provision that a party may demonstrate that an SLD not in the initial sample set could cause “severe harm,” and that SLD can potentially be blocked for a certain period of time.

The extent to which that provision would need to be exercised remains to be determined. However, given the concerns outlined in Part 2 and Part 3 of this series, it seems likely that there could be many additions to (and deletions from!) the blocked list, given the lack of correlation between the DITL data and actual at-risk queries.

If the accumulated risk from non-blocked SLDs were to become too large, it could become necessary for ICANN to withdraw the entire gTLD from the global DNS root.

Changes to the DNS root, once properly approved and authorized, can be implemented rapidly by updating the root zone file and notifying root server operators that a new zone file is available.

This part of the process is as straightforward for deletions as for additions.

The approval and authorization process, however, would need to be much faster for a deletion than it currently is for an addition because of the urgency of making the change or “rollback” after a determination was reached that a gTLD’s delegation needed to be revoked. The importance of rapid un-delegation is affirmed in Recommendation 3 of SAC062: Advisory Concerning the Mitigation of Name Collision Risk, published Nov. 7 by ICANN’s Security and Stability Advisory Committee (SSAC):

Recommendation 3: ICANN should explicitly consider under what circumstances un-delegation of a TLD is the appropriate mitigation for a security or stability issue. In the case where a TLD has an established namespace, ICANN should clearly identify why the risk and harm of the TLD remaining in the root zone is greater than the risk and harm of removing a viable and in-use namespace from the DNS. Finally, ICANN should work in consultation with the community, in particular the root zone management partners, to create additional processes or update existing processes to accommodate the potential need for rapid reversal of the delegation of a TLD.

For similar reasons, the DNS resource record TTLs for a new gTLD need to be managed carefully to minimize residual effects that may occur should a problematic TLD delegation be removed.…
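
To make the point about TTLs concrete, here is a small sketch of our own (not part of Verisign’s post) that uses the dnspython library to ask a root server, non-recursively, for a TLD’s NS delegation and print the TTLs on the returned record sets; those TTLs bound how long resolvers may keep serving a delegation from cache after it has been pulled from the root zone.

import dns.message
import dns.query
import dns.rdatatype

def delegation_ttls(tld: str, root_server: str = "198.41.0.4") -> None:
    """Query a root server (a.root-servers.net by default) for a TLD's NS
    delegation and print the TTL of each record set in the referral."""
    name = tld if tld.endswith(".") else tld + "."
    query = dns.message.make_query(name, dns.rdatatype.NS)
    response = dns.query.udp(query, root_server, timeout=5)
    for rrset in response.answer + response.authority:
        print(rrset.name, rrset.ttl, dns.rdatatype.to_text(rrset.rdtype))

# Example: inspect the delegation TTLs for .com; a newly delegated gTLD could be
# checked the same way when weighing how quickly a rollback would take effect.
delegation_ttls("com")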

Categories: External Articles, verisign Tags:

265 Million Domains: We Asked What Happened To The Verisign Report & Tonight Verisign Answered With The Numbers

November 15th, 2013 Comments off

Earlier today we asked what happened to the Verisign quarterly Domain Name Industry Brief, which has been missing for 2013, and tonight Verisign answered us.

In a post on the company blog, Verisign released a series of three infographics showing that worldwide domain name registrations stood at 265 million as of the end of the third quarter of 2013, up from the 252 million Verisign reported as of December 31, 2012.

In the blog post Verisign writes:

“Today Verisign announced that we are updating the Domain Name Industry Brief (DNIB) and a new version of the DNIB is expected to be released in the first quarter of 2014.”

“With the Internet continuing to evolve in new ways, we have been evaluating how best to align the DNIB with that evolution so it better addresses the interests of our readers and expands the scope of the trends we’re tracking. ”

“We remain committed to continuing to provide informative content on the latest industry trends that are most relevant to our readers.”

“Along with this announcement, we have also released infographics containing select DNIB data for the first three quarters of 2013.”

Here are the infographics; we at TheDomains.com appreciate the quick response:

[Infographic: Domain Name Industry Brief data, Q1 2013]

[Infographic: Domain Name Industry Brief data, Q2 2013]

[Infographic: Domain Name Industry Brief data, Q3 2013]

 …

What Happened To Verisign’s Quarterly Domain Name Industry Report?

November 15th, 2013 Comments off


Verisign’s quarterly report on the domain name industry seems to be missing in action.

The last Domain Name Industry Brief was published for the quarter ending December 31, 2012.

Verisign had been issuing quarterly reports containing the total number of domain name registrations, with a breakdown of ccTLDs, but the reports have been a no-show in 2013.

Verisign’s last quarterly report, issued for December 2012, showed 252 million domain names registered worldwide.

As you can see from Verisign’s own archive, Verisign has issued four quarterly reports a year on the domain industry since at least 2008; however, no quarterly reports have been issued in 2013.

We are now in the fourth quarter of 2013, so three reports are missing for the year.

TheDomains.com reached out to Verisign’s representatives a few weeks ago but did not get an explanation of why they haven’t produced any quarterly reports or when the next report will be issued.

 …

Verisign Publishes 3rd Part of Four Part Series On Name Collisions

November 14th, 2013 Comments off

Verisign published the third part of a new four-part series on name collisions last night.

“Blocking a second level domain (SLD) simply on the basis that it was queried for in a past sample set runs a significant risk of false positives.  SLDs that could have been delegated safely may be excluded on quantitative evidence alone, limiting the value of the new gTLD until the status of the SLD can be proven otherwise.

Similarly, not blocking an SLD on the basis that it was not queried for in a past sample set runs a comparable risk of false negatives.

A better way to deal with the risk is to treat not the symptoms but the underlying problem:  that queries are being made by installed systems (or internal certificates are being employed by them) under the assumption that certain gTLDs won’t be delegated.

A query for an applied-for generic top-level domain (gTLD) provides initial evidence that an installed system may be at risk from name collisions.  ”

“Depending on what data is collected, that evidence may also include one or more SLDs, the IP address of the resolver that sent the query, and other forensic information such as the full query string. ”

“This information can be a good starting point for understanding why an installed system has made certain queries, what could happen if the responses to the queries were changed, and what other queries, not in the particular sample set, could also put the installed system at risk.  A comprehensive analysis requires much more than just a count of the number of queries for a given gTLD and/or SLD.  It also requires a set of measurements such as those described in detail in the New gTLD Security, Stability, Resiliency Update: Exploratory Consumer Impact Analysis, incorporating the context of those queries:

  • Periodicity:  Do the queries repeat at a regular frequency?  This can help determine whether the queries are a result of user browsing, or of an automated process that depends on a certain response.
  • Affinity:  Where are the queries coming from?  Are they correlated with one country?  One network?
  • Impact:  Which network protocol generated the query?  The WPAD, ISATAP and DNS-SD protocols all generate DNS queries in support of internal network configuration that could result in queries to the global DNS.

The analysis in the New gTLD Security, Stability, Resiliency Update: Exploratory Consumer Impact Analysis applied these measurements to produce a qualitative “risk matrix” for applied-for gTLDs including risk vectors based on frequency of occurrence of WPAD, ISATAP, DNS-SD queries, internal name certificates, HTML references, and regional affinities, among other factors (such as queries that appear to be related to McAfee antivirus defenses).…
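
As an illustration of the first measurement above, periodicity, here is a toy sketch of our own (not code from Verisign’s study): given the query timestamps observed for a single gTLD/SLD pair, it flags the traffic as likely automated when the gaps between queries are tightly clustered around a fixed interval.

from datetime import datetime
from statistics import mean, pstdev

def looks_periodic(timestamps, max_cv: float = 0.1) -> bool:
    """Heuristic periodicity check: queries look automated when the
    coefficient of variation of the gaps between them is small."""
    if len(timestamps) < 3:
        return False
    ordered = sorted(timestamps)
    gaps = [(b - a).total_seconds() for a, b in zip(ordered, ordered[1:])]
    avg = mean(gaps)
    if avg == 0:
        return False
    return pstdev(gaps) / avg <= max_cv

# Example: a query repeating almost exactly every 30 minutes looks like an
# installed system (e.g., a WPAD or suffix-search probe), not a user browsing.
samples = [datetime(2013, 11, 14, 0, 0), datetime(2013, 11, 14, 0, 30),
           datetime(2013, 11, 14, 1, 0), datetime(2013, 11, 14, 1, 30)]
print(looks_periodic(samples))  # True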

Categories: External Articles, verisign Tags:

Verisign Publishes 2nd Part of 4 Part Series On Why Domain Blocking Of New gTLDs Isn’t Good Enough

November 8th, 2013 Comments off

Verisign just released the second post in a four-part series on why ICANN’s effort to fix the potential collision issue caused by the release of new gTLDs, by blocking domain registrations, isn’t good enough to fix the potential problems.

Here is the second post of the promised four-part series:

For several years, DNS-OARC has been collecting DNS query data “from busy and interesting DNS name servers” as part of an annual “Day-in-the-Life” (DITL) effort (an effort originated by CAIDA in 2002) that I discussed in the first blog post in this series.

DNS-OARC currently offers eight such data sets, covering the queries to many but not all of the 13 DNS root servers (and some non-root data) over a two-day period or longer each year from 2006 to present.

With tens of billions of queries, the data sets provide researchers with a broad base of information about how the world is interacting with the global DNS as seen from the perspective of root and other name server operators.

In order for second level domain (SLD) blocking to mitigate the risk of name collisions for a given gTLD, it must be the case that the SLDs associated with at-risk queries occur with sufficient frequency and geographical distribution to be captured in the DITL data sets with high probability.  Because it is a purely quantitative countermeasure, based only on the occurrence of a query, not the context around it, SLD blocking does not offer a model for distinguishing at-risk queries from queries that are not at risk.  Consequently, SLD blocking must make a stronger assumption to be effective:  that any queries involving a given SLD occur with sufficient frequency and geographical distribution to be captured with high probability.

Put another way, the DITL data set – limited in time to an annual two-day period and in space to the name servers that participate in the DITL study – offers only a sample of the queries from installed systems, not statistically significant evidence of their behavior and of which at-risk queries are actually occurring.

A concrete example will illustrate the point.  Continuing the observations and analysis started in New gTLD Security and Stability Considerations and New gTLD Security, Stability, Resiliency Update: Exploratory Consumer Impact Analysis, Verisign Labs analyzed the SLDs in queries for 14 applied-for gTLDs at the A and J root servers during the period from July 16 to October 19, 2013.…
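
To see why a purely quantitative block list built from a short sample can miss at-risk names, here is a toy sketch of our own (not Verisign’s analysis code): it builds a blocked-SLD set only from the SLDs observed in a two-day DITL-style sample and then reports which at-risk SLDs from the wider query population would slip through as false negatives.

# Hypothetical at-risk SLDs queried under a new gTLD over a full year,
# with annual query counts (all values invented for illustration).
at_risk_sld_queries = {
    "mail": 12000, "intranet": 9500, "wpad": 800,
    "backup": 40, "build-server": 6, "printer-07": 2,
}

# A two-day sample tends to capture only the frequently queried names.
sampled_slds = {"mail", "intranet", "wpad"}

blocked = sampled_slds                       # SLD blocking: block only what was sampled
missed = set(at_risk_sld_queries) - blocked  # false negatives: at-risk but unblocked

for sld in sorted(missed):
    print(f"{sld}: {at_risk_sld_queries[sld]} at-risk queries/year, not blocked")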

Categories: External Articles, new gTLDs, verisign Tags: