As expected, AT&T and T-Mobile USA appear to be close to scrapping their proposed merger, having withdrawn their application to the Federal Communications Commission.
Deutsche Telekom, the parent of T-Mobile, and AT&T said in a joint statement that they intend to pursue the $39 billion merger and will prepare for a federal antitrust lawsuit that is seeking to block the deal. But the companies also said that AT&T intends to take a $4 billion charge to earnings to reflect the potential breakup fees that AT&T must pay to Deutsche Telekom if the deal does not go through.
The actions followed the decision earlier this week by Julius Genachowski, the F.C.C. chairman, that the merger does not meet the commission’s standard for approval. Mr. Genachowski began circulating to other commissioners a proposed order to refer the case to an administrative law judge, the first step toward a commission move to block the deal.
The application withdrawal appears in part intended to prevent the F.C.C. from making public AT&T and T-Mobile records about the potential effects of the merger. The companies have maintained publicly that the deal would not lessen competition and that it would create jobs in the United States. But those points have been disputed by the Justice Department and the F.C.C., and F.C.C. officials have said that AT&T’s confidential filings indicate the merger would kill jobs.
But even if the F.C.C. approved the deal, the Justice Department is suing to block it. AT&T and Deutsche Telekom said they would return to the F.C.C. process if they secured approval from the Justice Department first.
In a statement, Deutsche Telekom said that the withdrawal “is being undertaken by both companies to consolidate their strength and to focus their continuing efforts on obtaining antitrust clearance for the transaction from the Department of Justice. As soon as practical, Deutsche Telekom and AT&T intend to seek necessary F.C.C. approval.”
AT&T issued its own statement saying that the companies are taking this step “to facilitate the consideration of all options at the F.C.C.” The company said the $4 billion pretax charge, to be taken in the current quarter, reflects a $3 billion cash payment and $1 billion worth of cellular phone airwaves, or spectrum, that AT&T must pay to T-Mobile’s parent if the merger does not receive regulatory approval.
Analysts said the merger, badly needed by sub-scale T-Mobile USA — the smallest of the four U.S. national mobile operators — looked less likely than ever to succeed.
The Wall Street Journal wrote today:
“AT&T and Deutsche Telekom insisted they weren’t throwing in the towel. Their strategy is to try to strike a settlement with the Justice Department or beat the agency in a trial that begins Feb. 13, then reapply with the FCC for merger approval.
But it was clear that the odds have lengthened significantly for a deal that would have created the country’s largest wireless operator and which represented a huge gamble for AT&T Chief Executive Randall Stephenson.
“We view this as a step towards concession,” Bernstein Research analysts wrote in a note to clients Thursday.
For consumers and for the wireless industry, AT&T’s move raises the prospect of many more months of uncertainty. T-Mobile has been struggling to find its niche, losing 850,000 contract customers this year and failing to land the most sought-after device, Apple Inc.’s iPhone. Even if government officials succeed in blocking AT&T’s purchase of T-Mobile, analysts and investors expect Deutsche Telekom to try to find another way to get out of the U.S. market.
A failure of the deal would also send AT&T back to the drawing board for a strategy to shore up its network and compete with larger rival Verizon Wireless.”
Meanwhile, Sprint Nextel is likely rejoicing over Thanksgiving with this announcement. The third-largest U.S. cellular operator has vigorously fought the AT&T–T-Mobile merger on the grounds that it would stifle wireless network competition. AT&T countered that the merger would bring LTE to rural areas using T-Mobile’s existing cell towers there. None of that seems to matter anymore.
Happy Thanksgiving to all in the U.S.!
Ethernet over Copper (EoC) technology is rapidly taking market share from T1, bonded (n x T1) and T3 private line circuits. Service areas are expanding, with EoC now available throughout most of the U.S. EoC is often referred to as “Mid-Band Ethernet” to denote speeds from 2 Mbps to 100 Mbps.
EoC is simply the transport of IEEE 802.3 Ethernet MAC frames over one or more bonded (n x) DSL copper pairs for the “first/last mile” between the customer premises and the service provider’s Point of Presence (POP). It uses the same copper twisted pairs used for PSTN voice/fax communications and for the high-speed Internet access offered by most telcos. For example, 10 Mbps EoC uses five such DSL copper pairs between end points. EoC is typically available at 2, 3, 4.5, 5, 6, 10, 15 and 20 Mbps speeds.
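The bonding arithmetic above can be sketched in a few lines. This is an illustrative calculation only: the 2 Mbps per-pair figure is an assumption inferred from the 10 Mbps / five-pair example; real per-pair rates vary with loop length and the DSL variant deployed.

```python
import math

# Assumed per-pair rate, consistent with the 10 Mbps = 5 pairs example
# in the text. Actual SHDSL/VDSL2 per-pair rates depend on loop length.
PER_PAIR_MBPS = 2.0

def pairs_needed(target_mbps: float, per_pair_mbps: float = PER_PAIR_MBPS) -> int:
    """Number of copper pairs to bond for a target EoC speed."""
    return math.ceil(target_mbps / per_pair_mbps)

# Typical EoC speed tiers from the text
for speed in (2, 3, 4.5, 5, 6, 10, 15, 20):
    print(f"{speed:>5} Mbps -> {pairs_needed(speed)} pair(s)")
```

The ceiling division reflects that a fractional pair cannot be provisioned; upgrading a circuit means physically bonding additional pairs.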
Why copper? Because fiber to commercial buildings is still not widely available in the U.S. According to Vertical Systems Group, fiber access deployment is currently limited to only 11.7% of buildings with more than 20 employees. What about the other 88.3% of business sites where fiber is not available? They are all candidates for EoC services.
EoC pricing is much lower than that of bonded T1 (4-wire) circuits, and it is much easier to upgrade by adding copper pairs than by adding conditioned 4-wire T1 circuits. For example, an upgrade from 2 Mbps to 10 Mbps can be deployed far more quickly and cheaply than n x T1 or fractional T3. A 3 Mbps Ethernet over Copper circuit can cost as little as $150 per month, while a 10 Mbps EoC private line can be as low as $400 per month (reference: EoC provider’s pricing guide).
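A quick cost-per-megabit comparison makes the economics concrete. The two EoC price points are the ones quoted above; the T1 monthly price is a hypothetical placeholder for illustration only, not a figure from this article.

```python
# (Mbps, $/month) — EoC figures from the text; T1 figure is assumed.
circuits = {
    "3 Mbps EoC":  (3.0, 150),
    "10 Mbps EoC": (10.0, 400),
    "1.5 Mbps T1": (1.5, 350),   # hypothetical T1 price, illustration only
}

def cost_per_mbps(mbps: float, monthly_usd: float) -> float:
    """Monthly cost per megabit of capacity."""
    return monthly_usd / mbps

for name, (mbps, price) in circuits.items():
    print(f"{name}: ${cost_per_mbps(mbps, price):.2f} per Mbps per month")
```

Note how the per-megabit cost of EoC falls as speed rises ($50 at 3 Mbps versus $40 at 10 Mbps), while a single T1 carries a much higher unit cost under the assumed price.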
Of course, circuit pricing depends on the end-to-end distance between EoC end points. For an inter-city connection, the carrier’s fiber backbone will likely be used to transport the Ethernet MAC frames end to end, with copper (DSL) used for the first and last mile. The maximum first/last-mile distance depends on the reach of the DSL variant deployed by the provider (e.g. SHDSL, VDSL or VDSL2). In some cases EoC repeaters/extenders can be used to increase the first/last-mile distance between the customer premises and the provider’s CO/POP.
EoC is commonly used as a point-to-point private line service between two business locations. However, it can also be offered as part of a virtual private line (e.g. a Frame Relay replacement with multiple destination end points) or a virtual private LAN (any-location-to-any-location connectivity). In those cases, the EoC circuit is terminated in the EoC service provider’s central office (CO) and switched to the destination end point based on the destination Ethernet MAC address. In the case of private line EoC service, the end-to-end circuit is hard-wired (“nailed up”) within the provider’s CO at subscription time.
EoC was originally part of the IEEE 802.3ah Ethernet in the First Mile (EFM) standard, which was approved by the IEEE Standards Board in 2004. That standard includes “Ethernet access OAM,” which can be used to help diagnose problems. Current EoC implementations can use any type of symmetric DSL (rather than only the SHDSL specified in the EFM standard).
Note that “Carrier Ethernet” over fiber is also specified by the IEEE 802.3ah standard. Either one or two fibers can be used to transport 1G Ethernet over Single Mode Fiber (SMF). Lower speeds, say 100 Mbps, can also run over the 1G Ethernet interface by simply inserting idle characters to fill the line rate to 1 Gbps.
Today, EoC is available in many areas of the U.S., with XO Communications being one of the leading service providers focused exclusively on business customers. XO currently offers the various types of EoC services at speeds up to 100 Mbps. They also offer Ethernet over Fiber.
An overview of XO’s Business Ethernet Services is at: http://www.xo.com/services/carrier/transport/Pages/ethernet.aspx
For more information on XO’s EoC and other carrier class services for business customers please contact:
XO Communications Sales Executive
As subscriber usage and advanced applications increasingly shift to Wi-Fi, leading mobile operators are undertaking a range of initiatives to integrate Wi-Fi technology into their mobile data services, according to a major new report from Heavy Reading (www.heavyreading.com), the research division of Light Reading (www.lightreading.com). This initiative seems to go beyond public hotspots into what’s referred to as “managed Wi-Fi” access.
Wi-Fi Strategies for Mobile Operators analyzes the technology advances and ongoing standards work that will allow mobile operators and their customers to take better advantage of Wi-Fi. From a commercial perspective, the report addresses integration of Wi-Fi with the cellular environment, examines to what extent “managed Wi-Fi” should be made part of the end-user service, and explains why Wi-Fi integration should fundamentally be viewed as a platform for service innovation and revenue growth, not simply an offload solution to reduce network costs.
The report provides strategic highlights for 14 major mobile operators that are making substantial investments in Wi-Fi access networks, and examines the positioning of more than 23 leading vendors that are working to help operators influence and enhance the Wi-Fi user experience. The 14 mobile operators are listed here:
“The growth of Wi-Fi has been driven by unlicensed spectrum, standardization and the cost curves that derive from advanced silicon design and manufacturing,” notes Gabriel Brown, Senior Analyst with Heavy Reading and author of the report. “Paired with growth in smartphones, this has created conditions that have literally revolutionized the way in which mobile data services are consumed and how the industry is structured. It was the launch of the Wi-Fi-enabled iPhone in 2007 that signaled the game had changed, and confirmed that local-area wireless technology had made an indelible mark on the cellular industry.”
“With rich-media applications such as Skype, Facetime, BBC iPlayer, Spotify and others being designed to run over Wi-Fi rather than 3G – and in some cases restricted to Wi-Fi because cellular is too congested or expensive – it is clear that users derive value in this form of connectivity that is additive to the 3G wide-area experience. Reclaiming some of that usage and influence is strategically important for operators and is underpinning a renewed push to integrate Wi-Fi more effectively into their subscriber offers.”
Key findings of Wi-Fi Strategies for Mobile Operators include the following:
- High demand from smartphone users and widespread availability make Wi-Fi technology a “game-changer” for the mobile data industry. However, direct opportunities for operators are limited. Low-cost hardware, unlicensed spectrum and link-layer interoperability have propelled Wi-Fi’s global reach, creating tremendous value for smartphone users. But these same growth drivers have resulted in many different types of Wi-Fi that are so diverse in configuration, performance and ownership that it is virtually impossible for mobile operators to develop broad-based Wi-Fi strategies.
- The potential to capture value from Wi-Fi is so compelling that Tier 1 operators in all geographies have committed investment to evaluations and deployments. This is attracting major vendors and technology innovators back to service provider Wi-Fi and forcing it up the priority list at mobile operators worldwide.
- Public access hotspots are the most approachable Wi-Fi opportunity for mobile operators. Operators are attracted to the hotspot market not because it fundamentally changes their economics or value proposition, but because it is an actionable opportunity. Subscribers see clear incremental benefit from having Wi-Fi bundled with their data plans, and operators can ensure reasonably predictable performance and add value for users via capabilities such as auto-login using SIM authentication.
- SIM authentication is the first “proper” step toward Wi-Fi integration with the mobile core, offering security and usability benefits. Together with the Next Generation Hotspot initiative, industry-wide procedures for automatically and securely connecting smartphones to appropriate APs are being developed. This will finally make Wi-Fi a “trusted access network” from the perspective of the mobile core network – a fundamental shift in operators’ approach to Wi-Fi.
Towerstream is doing citywide Wi-Fi in the NYC metro area. Please see: Metro WiFi Reborn: City Wide Mega-Hot Spot for Mobile Data Offload
Federal Communications Commission (FCC) Chairman Julius Genachowski expressed concerns about the “spectrum crunch,” which may impede the development of mobile broadband in the U.S. Speaking in Hong Kong at the GSMA Mobile Asia Congress, Mr. Genachowski outlined the FCC’s initiatives to help mobile broadband develop in the U.S. and potentially beyond.
With huge increases in mobile connections, mobile broadband subscriptions and devices putting pressure on networks, Genachowski said the problem of spectrum demand outstripping supply is “the most immediate threat to a successful mobile future.”
“We need to tackle the looming spectrum crunch by dramatically increasing the amount of spectrum available for mobile broadband. The FCC has made recovering spectrum one of our highest priorities,” he said.
The FCC’s national broadband plan includes spectrum recovery goals to free up 500 MHz of spectrum for mobile broadband in the U.S. by 2020, including proposed two-sided spectrum auctions in which existing spectrum license holders contribute unused spectrum and take a share of the revenue created by its reallocation.
Genachowski said this plan, under consideration by U.S. lawmakers, could potentially free up 100 MHz of spectrum and generate US$25 billion for the U.S. Treasury. “This is the biggest single step we can take to free up the biggest blocks of spectrum,” he said.
Other approaches the FCC is looking at include dynamic spectrum sharing, the creation of a second-hand spectrum market, and offloading data onto Wi-Fi networks to reduce pressure on mobile networks when hotspots are available.
The FCC has made mobility a universal service goal for the first time and is also making it easier for operators and infrastructure companies to provision mobile broadband. “The commission has made it a priority to identify and remove barriers to broadband infrastructure build-out,” Genachowski said. Steps include reducing the cost of attaching wired and wireless equipment to utility poles and allowing spectrum to be used for wireless backhaul, which will help with LTE provision in rural areas.
Genachowski also acknowledged that operators need a meaningful return on their extensive investment in mobile broadband, so the FCC has permitted cellcos to operate tiered pricing that reflects usage levels. All but Sprint have taken advantage of “pay for bandwidth consumed.”
Looking beyond the U.S., Genachowski said the FCC is keen to work with the International Telecommunication Union (ITU) to make progress on issues to do with free movement of data across borders to unleash the potential of cloud computing for mobile. “We must also prioritise and set global targets. We need to all work together to find spectrum globally to tackle the spectrum crunch,” he said.
Also at the GSMA Mobile Asia Congress, Facebook said it expects mobile to be the main source of its next billion users as smartphones become more powerful and the value of adding social tools to devices is realised. “We expect our next billion users will come primarily on mobile,” Facebook VP for partnerships and corporate development Vaughan Smith said this morning in Hong Kong.
“We see people talking about all of the capabilities of the device that are out there. When we look at what we should be talking about, we think that mobile is much more powerful when you add social. And we think the confluence of those two trends are the most important thing going on in technology over the next ten years,” he added.
Facebook currently has 350 million of its 800 million users around the world accessing the service on mobile devices, with twice as many actual visits coming via mobile devices as via desktops. And mobile is expected to become much more significant in the coming years.
ETSI successfully demonstrated the interoperability of products based on its new M2M standards at the recent ETSI Machine to Machine workshop held in France in October. Five comprehensive demonstrations, organized by ETSI’s Technical Committee for Machine to Machine communications (TC M2M), showcased how the interoperability of standards-based solutions in M2M products is key to market success.
The event, the first in a series of ETSI activities focused on M2M interoperability, included thirteen diverse organizations and covered a wide cross section of M2M applications. These included Smart Energy, Environmental Sensing, mHealth, Intelligent Transport, Ambient Assisted Living, Personal Robots, Home Automation, Medical Appliances and Smart Metering.
The demonstrations covered architectural components specified in the ETSI M2M standard, including M2M devices, gateways with associated interfaces, applications, access technologies as well as M2M Service Capabilities Layer.
The companies involved included Actility, Cinterion Wireless Modules GmbH, Grid2Home, Intecs, Intel, InterDigital, NEC, OFFIS, Radisys, Sensinode, Telecom Italia, Vodafone and the Vodafone D2 Test & Innovation Center.
Olivier Hersent, CEO of Actility, commented: “Our contribution to the ETSI M2M demo focuses on interoperability with indoor area networks, such as ZigBee. The demonstration shows how ETSI M2M enables operators and utilities to massively deploy in-the-cloud applications controlling indoor area networks and leveraging a shared infrastructure. ETSI M2M provides the foundation enabling mission critical Internet of Things applications, such as load shifting and demand response.”
Manfred Kube of Cinterion commented: “Standards are the key ingredient for establishing an ecosystem of interoperable health solutions that empower people and organizations to better manage health and wellness. By stepping up our collaborative efforts, we can accelerate the development and implementation of portable, wireless, medical devices that enable a more effective and patient-centric care model.”
James N. Nolan, Executive Vice President of Research and Development at InterDigital, commented: “We are grateful for ETSI’s commitment to driving the standardizing of M2M solutions and promoting interoperability across a wide variety of industries and applications, unlocking the true potential of new services around the world. ETSI’s roadmap is well-aligned with InterDigital’s advanced R&D efforts in M2M communications and our vision of tomorrow’s Internet of Things.”
Dr. Heinrich Stüttgen, Vice President of NEC Laboratories Europe, commented: “M2M is an exploding area for new consumer and business services, providing great opportunities for service operators and enterprises. ETSI M2M standards enable an expanding market and many new opportunities for more intelligent services. At this workshop we showcased heterogeneous M2M devices in an intelligent home with a personal robot and medical devices, interworking to aid the daily life of elderly citizens. Our research activities and our contributions to the ETSI M2M standards as well as our investigations in advanced services areas like Ambient Assisted Living (AAL) or home energy control show NEC’s commitment to advance M2M technology for the benefit of society.”
Erik Brenneis, Head of M2M, Vodafone commented: “Building on new ETSI specifications, Vodafone and Intel demonstrated a breakthrough ability to use the SIM card to securely deploy new M2M services on remote devices. This opens the door to significant new commercial applications and cost reductions in the operation of M2M services.”
Just this week, Sierra Wireless and IBM formed an M2M Industry Working Group within the open source Eclipse Foundation. That group aims to define and implement an open standard platform of development tools for creating M2M applications, and IBM, along with partner Eurotech, has donated the source code of its M2M messaging system to be a cornerstone of the effort. IBM hopes to accelerate the standardization of the MQTT protocol, originally developed by IBM and Eurotech in 1999.
However, IBM will not have it all its own way, as some parties think HTTP itself should be the standard protocol for M2M. MQTT, say its supporters, is better optimized for these networks, particularly because it supports “publish and subscribe” messaging, which means two machines do not need to maintain a sustained connection to communicate. That is very important when networks are unreliable or low-bandwidth, as a device can publish when it is ready and then power down.
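The decoupling that publish/subscribe provides can be illustrated with a toy in-memory broker. This sketch is not the MQTT protocol itself; it only demonstrates the pattern MQTT uses (including retained messages) and why the publisher can disconnect before the subscriber ever appears.

```python
from collections import defaultdict

class TinyBroker:
    """Toy pub/sub broker illustrating MQTT-style decoupling.

    Not real MQTT: no wire protocol, QoS, or sessions. It shows only
    the pattern: publisher and subscriber never talk directly, and a
    retained message lets a late subscriber catch up.
    """

    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks
        self.retained = {}                    # topic -> last message

    def publish(self, topic, message):
        # Device publishes when ready, then can "power down" immediately.
        self.retained[topic] = message
        for callback in self.subscribers[topic]:
            callback(topic, message)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)
        # A late subscriber still receives the last retained message.
        if topic in self.retained:
            callback(topic, self.retained[topic])

broker = TinyBroker()
broker.publish("sensors/temp", "21.5")   # publisher fires and disconnects
readings = []
broker.subscribe("sensors/temp", lambda topic, msg: readings.append(msg))
print(readings)
```

Contrast this with plain HTTP, where the reading machine must poll (or hold a connection open to) the device; here the broker absorbs that coupling, which is what makes the model attractive on unreliable, low-bandwidth M2M links.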
Observation & Comment:
Lack of standards has slowed M2M growth. Here are the M2M standards committees we are aware of:
•European Telecommunications Standards Institute (ETSI) TC M2M
•China’s sensor network standards working group under the China National Information Technology Standardization Committee
•Telecommunications Industry Association (TIA) TR-50 Smart Device Communications Engineering Committee (United States)
We know of no other comparable M2M standardization effort in the U.S. The closest is the TIA TR-50 standards committee on Smart Devices, a.k.a. the Internet of Things. For more info on that activity, see:
Our concern is that M2M standards are needed within the network for the provisioning, billing, maintenance and management systems for all the new smart devices. Who is doing that work? If no one, then each M2M network provider will build its own purpose-built M2M platform.
One of the fastest networks ever built is set to come online this month thanks to $62 million in federal stimulus money. The superfast 100-Gbps Ethernet link will run over a span of dark fiber owned by Level 3 Communications and connect Department of Energy research centers in Illinois, California and Tennessee. The network, which will rely on the Internet2 research consortium for connectivity and routers from Alcatel-Lucent subsidiary LGS Innovations, is designed to enable researchers to conduct more accurate real-world simulations of phenomena such as climate change and particle physics.
“Since 1990, our traffic has grown by a factor of 10 every 47 months on average,” explained Steve Cotter, ESnet department head. “We are in the age of observation right now in science. All of these scientific instruments and experiments are collecting significantly more data than they ever have in the past. This data needs to be stored … and moved around. We’ve been using this 10 Gigabit network … but we knew that if we didn’t start planning now for a 100 Gigabit network, the demand would overwhelm us.”
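Cotter’s growth figure can be turned into back-of-the-envelope planning math. The only input is the quoted rate (10x every 47 months); everything derived from it below is illustrative.

```python
import math

# "Our traffic has grown by a factor of 10 every 47 months on average"
MONTHS_PER_10X = 47

# Implied compound annual growth factor: 10^(12/47) ≈ 1.8x per year
annual_factor = 10 ** (12 / MONTHS_PER_10X)

def months_to_grow(factor: float) -> float:
    """Months for traffic to grow by `factor` at the stated rate."""
    return MONTHS_PER_10X * math.log10(factor)

print(f"Implied annual growth: {annual_factor:.2f}x per year")
print(f"A 10x capacity jump (10G -> 100G) buys ~{months_to_grow(10):.0f} months of headroom")
```

In other words, at the historical rate, the 100G upgrade buys ESnet roughly four years before traffic catches up again, which is why planning had to start while the 10G network still had spare capacity.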
The 100G ESnet upgrade was funded through the Obama administration’s economic stimulus package in February 2009.
“We received stimulus funds to build this 100G network with the intention of doing two things. One was accelerating the deployment of 100G so that the equipment manufacturers didn’t shelve the technology on fears that there wouldn’t be demand. … The other reason they gave us the money was to build a next-generation network test bed and to fund network research,” Cotter said.
Costing an estimated $62 million, the 100G network will link three DOE research centers: Lawrence Berkeley National Laboratory in California, Argonne National Laboratory in Illinois and Oak Ridge National Laboratory in Tennessee. It also will connect with a key Internet exchange point in New York for connecting to research networks overseas.
Note: IEEE ComSocSCV has held two meetings on 100G transmission: 40/100G Ethernet in October 2010 and 100G Ethernet/OTN/CEI in July 2011. Presentations may be downloaded free from the Archive section of our web site: www.comsocscv.org