Video Surveillance and Video Analytics: Technologies whose time has come?
Introduction:
The IEEE ComSoc SCV chapter's April 14th, 2010 meeting was very well attended, with more than 80 people present. This was our first joint meeting with TiE (The Indus Entrepreneurs). The meeting was titled “Architectures and Applications of Video Surveillance and Video Analytics Systems” and featured talks plus a panel discussion on those topics.
The speakers scheduled to participate in the talks and panel session were Professor Suhas Patil, Chairman and CEO of Cradle Technologies; Basant Khaitan, Co-founder and CEO of Videonetics; and Robb Henshaw, VP Marketing & Channels, Proxim Wireless. Robb Henshaw, who was scheduled to speak on “A Primer on Wireless Network Architectures and Applications for Video Surveillance,” could not attend the meeting due to illness and was replaced for the presentations section of the evening by Alan J. Weissberger, IEEE ComSoc SCV chairman. The panel session was moderated by Lu Chang, Vice-Chairman of IEEE ComSoc SCV. This article was co-authored with Alan J. Weissberger, who contributed the comment and analysis section, raised several unaddressed but nevertheless pertinent questions, and provided references to background articles on video surveillance and video analytics.
Presentation Highlights:
While presenting on behalf of Robb Henshaw on Wireless Network Architectures, Alan J. Weissberger noted that several new technologies are now converging which will make video surveillance a growing market and viable business. These include: higher-quality IP digital cameras, improved and cost-effective video compression technologies (e.g. H.264/MPEG4 and HDTV), fixed broadband point-to-point and point-to-multipoint networks (including fixed WiMAX and proprietary technologies), and mobile broadband (including 3G+, mobile WiMAX and LTE).
To support the claim of a growing market for video surveillance and video analytics, Alan cited several key examples of applications for these technologies such as: security and surveillance applications, emergency and disaster management, asset and community protection by monitoring of buildings and parking lots, public entry/exits, sensitive areas such as ATMs, as well as high-traffic areas like highways, bridges, tunnels, public areas such as parks and walkways, infrastructure like dams and canals and buildings like a cafeteria, halls and libraries. Other applications include securing of sensitive areas like runways and waterways, perimeter security for military installations, remote monitoring of production on factory-floors and tele-medicine/eHealth initiatives.
Alan explained that Proxim believes that HDTV is going to be the technology of choice for video compression because users will be demanding higher-quality video images. Furthermore, Proxim thinks that the wireless communication networks which convey the video streams are best built in a point-to-point and point-to-multipoint topology, rather than (WiFi) mesh, which has fallen out of favor. He noted that Proxim’s broadband wireless transport systems that operate over these point-to-point and point-to-multipoint topologies do so over a private network (as opposed to connecting via the Internet like Cradle’s systems do, covered later in this article). Moreover, 95% of Proxim’s installations use fixed broadband wireless (both fixed WiMAX, i.e., IEEE 802.16d-2004, and a proprietary technology to increase speed and/or distance) rather than mobile broadband wireless connections.
Alan’s talk elicited two questions from the audience. The first questioner asked why analog video surveillance technologies found favor in deployments while digital video surveillance technologies were placed on the back burner after initial deployment. In his answer, Alan pointed out that digital video surveillance technologies not only need high-quality digital cameras, but also require a reliable transmission network (wired or wireless) which can provide steady bandwidth to transmit the video surveillance data to a point of aggregation like a central video server. In the absence of sufficient constant bit rate bandwidth, the resulting digital video stream quality will be unacceptable due to jitter or freezing of the image (caused by an empty playback buffer). The lack of sufficient network bandwidth was a major reason digital video surveillance technologies failed to gain a large market share compared to analog systems. The second question related to the impact of electromagnetic interference (EMI) on the video information. Alan explained that the new wireless broadband communication systems (both WiMAX and LTE) employ a multicarrier modulation scheme, orthogonal frequency division multiplexing (OFDM), which is fairly resistant to EMI. Furthermore, OFDM can also be combined with multiple-input multiple-output (MIMO) transmission schemes to minimize the likelihood of errors at the receiver end.
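Alan's point about constant bit rate bandwidth and playback buffers can be illustrated with a minimal simulation. This is a sketch with illustrative numbers (the bitrates and buffer size are assumptions, not figures from the talk): two links deliver the same average throughput, but the bursty one repeatedly empties the playback buffer and freezes the image.

```python
# Minimal sketch: why insufficient, bursty bandwidth freezes a video stream.
# All numbers are illustrative assumptions, not figures from the talk.

def simulate_playback(arrival_kbps, decode_kbps=2000, startup_buffer_kb=500):
    """Drain a playback buffer at a constant decode rate while the network
    delivers a variable number of kilobits each one-second tick. Returns
    the number of seconds the image was frozen (buffer underrun)."""
    buffer_kb = startup_buffer_kb
    freezes = 0
    for delivered in arrival_kbps:
        buffer_kb += delivered            # network fills the buffer
        if buffer_kb >= decode_kbps:
            buffer_kb -= decode_kbps      # decoder drains one second of video
        else:
            freezes += 1                  # underrun: image freezes this second
    return freezes

steady = [2000] * 10                      # constant bit rate, matches decoder
bursty = [500] * 5 + [3500] * 5          # same average, delivered in bursts

print(simulate_playback(steady))          # no freezes
print(simulate_playback(bursty))          # several seconds of frozen image
```

Both traces carry the same total traffic; only the burstiness differs, which is why Alan stressed steady (near constant bit rate) bandwidth rather than raw average throughput.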
In his talk on “Video Surveillance, Security and Smart Energy Management Systems,” Suhas Patil explained that recent improvements in semiconductor chip set capabilities and new computer architectures have promoted the growth of digital video surveillance technologies, such as those which employ network video recorders to aggregate video from an entire city. He also pointed out that a key contributor to the adoption of video surveillance systems in many parts of the world (despite concerns of invasion of personal privacy) was the possibility, and actual occurrence, of terrorism, with the city of London (U.K.) being a pioneer in this regard. While describing the structure of the video surveillance system, Suhas noted that a critical requirement for these systems is that they be resilient to any fault at any time. These faults can include network breakdowns, power failures, control room disablement or faults caused by extreme ambient temperatures or extreme climatic conditions. As far as access technologies are concerned, digital video surveillance systems can use WiFi mesh networks, WiMAX networks or proprietary communication systems. According to Suhas, the technology underlying a digital video surveillance system is highly complex, employing state-of-the-art hardware such as cameras, storage systems and servers, and state-of-the-art software including operating systems and transmission technologies. Thus, the entire system needs very careful design in order to maximize its efficacy. Suhas also briefly explained video analytics as a system which can detect an object, such as a human being, behaving in a manner which would be difficult for a human observer to notice, such as a party guest moving around the room in a random or rapid manner compared to other party guests.
Finally, the talk then pointed out several major challenges faced by video surveillance systems including the need to keep abreast of the rapidly changing technologies as well as practical deployment challenges.
In response to a query on working on still images in a pixel-by-pixel fashion, Suhas explained that it is possible for a basic camera to capture raw information about a scene and then have the data processed pixel by pixel in order to maximize the dynamic range before the image is converted to JPEG format. During the Q&A, Suhas briefly recounted the history of his company, Cradle Technologies, a spinoff from Cirrus Logic which built multicore processors long before anyone else considered the technology valuable. Additionally, while clarifying his assertion about digital versus analog video surveillance technologies, Suhas noted that while analog technologies are better suited to low-light conditions and offer better dynamic range than their digital counterparts, digital signals allow for better image resolution than analog signals. However, it is difficult to claim one technology as being clearly superior to the other.
The final presentation of the evening, by Basant Khaitan, titled “The Role of Video Analytics in Video Servers and Network Edge Products,” explained the nature of video analytics (VA) as a young field which has also been referred to as video content processing or video content analysis. Basant explained that VA can be defined as the real-time classification and tracking of objects like people and vehicles by using the objects’ outlines rather than any bodily or facial features. The analytics system can be either co-located with the camera itself (at the network’s edge), or situated at a central server which receives the video streams from the various cameras at the network edges. Additionally, Basant pointed out that while a video frame’s size is of the order of megabytes, the corresponding analytics information for that frame is often no more than a few hundred bytes in size. While explaining the technical details of VA, Basant opined that modern VA systems produce results which are sufficiently reliable for practical use despite the presence of artifacts born from poor ambient light or dust-filled air. Basant then elaborated on a practical VA system built by his company and currently being used by the police in Calcutta, India for traffic management. According to Basant, VA systems built for purposes such as traffic control are highly mission-critical and require 100% reliable operation. In conditions such as those found in developing countries, VA systems face severe challenges due to the presence of dense populations and poor public compliance with traffic laws. Furthermore, in tropical countries, extreme climatic conditions such as hot weather, rain-flooded streets and dust-filled air can also hamper the quality of the analytics results. In the case of Calcutta, all of the above conditions are faced by the VA system which is being used to control the city’s traffic lights.
In this case, network-edge deployed analytics information is sent to a local video server (which, incidentally, was developed in cooperation with Cradle Technologies) from where the information can be remotely retrieved and viewed. Basant pointed out that several intersections of Calcutta are now monitored by the VA system, which has replaced the previous system that was monitored entirely by human beings.
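Basant's observation that a video frame runs to megabytes while its analytics record is only a few hundred bytes is the key to why edge analytics slashes backhaul requirements. A back-of-the-envelope sketch makes this concrete (the frame size, metadata size and frame rate below are assumed values for illustration, not figures from the talk):

```python
# Back-of-the-envelope comparison of backhauling full video frames versus
# sending only per-frame analytics metadata from a network-edge camera.
# All three constants are illustrative assumptions.

FRAME_BYTES = 2_000_000      # ~2 MB per frame (assumed, "order of megabytes")
METADATA_BYTES = 300         # "a few hundred bytes" of object data per frame
FPS = 15                     # an assumed surveillance frame rate

def daily_gigabytes(bytes_per_frame, fps=FPS):
    """Gigabytes per camera per day at the given per-frame payload."""
    return bytes_per_frame * fps * 86_400 / 1e9

full = daily_gigabytes(FRAME_BYTES)
meta = daily_gigabytes(METADATA_BYTES)
print(f"full frames: {full:,.0f} GB/day/camera")
print(f"metadata:    {meta:.2f} GB/day/camera")
print(f"reduction:   {full / meta:,.0f}x")
```

Under these assumptions, edge analytics cuts the per-camera transport load by several thousandfold, which is why analytics co-located with the camera is attractive over constrained links like those in the Calcutta deployment.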
Panel Discussion Q and A:
The panel session was more of a collaborative Q and A with Suhas Patil and Basant Khaitan. There were many questions from the meeting attendees, several of which could not be entertained due to a shortage of time (see list of unaddressed questions below).
In response to a question about how many video surveillance cameras are connected wirelessly versus via wireline, Basant mentioned that none of the cameras in their deployed systems are connected wirelessly at this time, and that all camera connections are of the wireline type. Suhas noted that while most cameras are connected via a wired connection like a CAT5 cable, access to the content can be accomplished wirelessly via a cellular service (such as in the Indian scenario) or by a WiFi connection. WiFi-based access to the video server is also available if the server is connected to the Internet (Cradle’s server is one such example). The sensors in the video surveillance network can themselves also be connected wirelessly via a ZigBee mesh network, although they have not yet been so connected.
A question was raised about whether video processing is ready to see a major innovation comparable to what MapReduce did for Google’s text processing. Suhas responded by explaining that video analytics for applications like license-plate recognition can be done in the cloud. When queried about how Cradle’s technology can help mitigate the impact of, or altogether prevent, future terrorist attacks, Suhas pointed out that contemporary video surveillance systems had either failed altogether or malfunctioned during terrorist attacks. Cradle’s system, on the other hand, continually monitors the deployed system for functional integrity via a central server cloud in order to ensure that the video surveillance system is fully operational at all times.
The panelists were then asked what the preferred mode of connection is for a city-wide array of cameras. Suhas invoked the example of Cradle’s approach to digital video surveillance, where fixed broadband wireless access via WiMAX or WiFi mesh is used to connect their networked video server to the Internet. Furthermore, an IP-VPN client (such as a PC or other screen-based device) is connected to the networked video server through the public Internet via a 3G or mobile WiMAX connection. In response to a question regarding the need for video analytics in countries such as those in Asia, where labor is abundant and cheap, the panelists pointed out that an average human being’s concentrated attention span is only about 10-15 minutes and that, for the overwhelming majority of the time, nothing significant occurs which warrants raising an alarm; hence, it is imperative that an automated VA system be put in place for applications such as those mentioned earlier in this article.
The final question of the evening asked how real-time bandwidth fluctuations within networks such as mesh networks affect VA performance. Suhas Patil mentioned that placing the video server as close to the network drop-off point (i.e., close to the camera) as possible allows good quality video to be streamed to the server. Thereafter, special network access techniques which circumvent the fluctuating bandwidth can be used to remotely retrieve the video information stored on the server.
After the panel session concluded at 9pm, several attendees stayed on for one-on-one interaction with the speakers. This continued until 9:15pm, when the lights were turned off and we were forced to vacate the auditorium.
Unaddressed questions (submitted by Alan Weissberger to ComSocSCV Discussion List):
- Where is video surveillance used now and what are the driving applications?
- Are most of the video surveillance network architectures fixed point to point or point to multi-point, rather than mobile/wireless broadband?
- What role will 3G (EVDO, HSPA), WiMAX (fixed and mobile), and LTE play in delivering video content? Why is mobile broadband required for video client access?
- Are proprietary wireless technologies more cost effective for the performance they offer? Is this a concern for the customer?
- What type of security and network management is being used in video surveillance systems, e.g. for authentication and to prevent intrusion or monitoring?
- What role does video analytics play to augment the potential and power of a video surveillance system? Can it also be used as a stand alone offering?
- Why are IP VPNs needed to convey and deliver the video content? Why not use a dedicated private network instead?
- Is there any intersection between high end video conferencing and video surveillance systems? Are the same cameras, video transport facilities, and network management used for each? What are the key differences?
- What new technologies or business models are necessary for video surveillance to become a really big market?
- What current barriers/obstacles to success are the video surveillance and video analytics markets now experiencing?
- How have terrorist attacks (e.g. the Mumbai attacks in late 2008) and natural disasters (e.g. earthquakes) affected the video surveillance market? What is the opportunity here?
Comment and Analysis (from Alan J Weissberger):
1. Proxim’s answer to the question, “Why video surveillance?” included these bullet points:
· Perimeter, public monitoring solutions are becoming a key component for enterprises
· Educational, healthcare and financial institutions are beginning to rely on surveillance systems to ensure safety within their premises
· Public safety organizations depend on archived data from video monitoring systems to reduce vandalism in troubled neighborhoods
· Live traffic surveillance is increasingly being used as a tool in community protection
· Terrorist threats and public safety challenges continue to drive the need for high quality remote surveillance and timely response
Additionally, we’d include production plant and factory floor (remote) monitoring to prevent schedule slips and ensure good quality control.
2. The role of broadband wireless networks in stimulating video surveillance:
-Fixed broadband wireless point-to-point and point-to-multipoint networks and equipment (e.g. Motorola Canopy and Proxim’s products) that replace equivalent topology wireline networks for delivering video over a private network. Both proprietary fixed broadband wireless technology and IEEE 802.16d fixed WiMAX are used. Those broadband wireless networks cost a fraction of the equivalent wireline networks and can be provisioned in a much shorter timeframe. Fixed WiMAX could also be used to access the broadband Internet in an IP VPN scenario.
-Mobile broadband (3G+, mobile WiMAX, LTE), which adds a whole new dimension to video surveillance and enables many new applications, e.g. an IP VPN mobile client observing video images in remote locations, cameras in police cars transmitting video to police HQ buildings while moving at high speed, and emergency vehicles transmitting videos of natural disasters (hurricanes, earthquakes, etc.) to first-responder locations that will deal with the problem(s).
3. It’s important to distinguish between the broadband wireless network architectures and topologies of Proxim (a wireless broadband transmission/backhaul company) and Cradle (a networked video server/client solutions company):
a] Proxim makes broadband wireless transport systems that operate over a point-to-point or point-to-multipoint PRIVATE network. Those systems backhaul video surveillance and other traffic to one or multiple destinations. Proxim says that 95% of their installations use fixed (rather than mobile) broadband wireless connections.
b] Cradle uses fixed BWA (WiMAX or mesh WiFi) from their networked video server to access the Internet. On the client side, Cradle uses 3G or mobile WiMAX to connect the IP VPN client PC or other screen-based device to the networked video server through the public Internet. The key issue with that approach is that the end-to-end IP VPN server-to-client connection has to be high bandwidth and near constant bit rate, while the client access needs a high-bandwidth, steady-state mobile broadband connection to observe the MPEG-4 coded video over the IP VPN connection while in motion. Otherwise the video image will be unacceptable or freeze.
4. Basant’s example of controlling Calcutta traffic lights using video analytics integrated with a Networked Video server was a great demonstration of the underlying technology and proof of how valuable it is.
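The feasibility concern in point 3b], whether the mobile client's link can actually sustain the MPEG-4 stream through the IP VPN, can be sketched as a simple capacity check (the stream bitrate, VPN overhead and headroom factors below are assumptions for illustration, not figures from the meeting):

```python
# Quick feasibility check: can a mobile client link sustain an MPEG-4 video
# stream over an IP VPN? The overhead and headroom factors are illustrative
# assumptions, not measured values.

def link_is_adequate(stream_kbps, link_kbps, vpn_overhead=0.10, headroom=0.25):
    """The VPN tunnel adds per-packet overhead, and a fluctuating mobile
    link needs extra headroom above the nominal stream rate to deliver a
    near constant bit rate to the playback buffer."""
    required = stream_kbps * (1 + vpn_overhead) * (1 + headroom)
    return link_kbps >= required

print(link_is_adequate(stream_kbps=2000, link_kbps=3100))  # True
print(link_is_adequate(stream_kbps=2000, link_kbps=2200))  # False
```

Note that the second link nominally exceeds the 2 Mbps stream rate yet still fails the check: once tunnel overhead and fluctuation headroom are accounted for, raw link speed alone is not enough, which is the crux of the "high bandwidth and near constant bit rate" requirement above.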
References:
Here are a few background articles on video surveillance and analytics:
Video Surveillance and WiMAX- a great marriage or not meant for each other? Four companies weigh in! (all three speakers/panelists plus Sprint were interviewed for this article)
http://www.wimax360.com/profiles/blogs/video-surveillance-and-wimax-a
The Wireless Video Surveillance Opportunity: Why WiMAX is not just for Broadband Wireless Access by Robb Henshaw
http://www.wimax.com/commentary/blog/blog-2009/august-2009/the-wireless-…
Video Surveillance Going Fwd, Suhas Patil, ScD
http://0101.netclime.net/1_5/048/174/0e9/scan0046.pdf
Remote Access Video Surveillance & Analytics, Cradle Technologies
http://cradle.com/about_us.html
INTELLIGENT VIDEO ANALYTICS, a Whitepaper
http://www.videonetics.com/VivaWhitePaper.pdf
Exclusive Interview: Robb Henshaw of Proxim Wireless!
http://www.goingwimax.com/exclusive-robb-henshaw-on-proxim-wireless-5857/
Video Surveillance Product Guide
FCC’s National Broadband Plan overview and IEEE ComSoc SCV March 10, 2010 meeting report
Introduction:
The IEEE ComSoc SCV chapter’s March 10th, 2010 meeting featured a very informative talk by William B. Wilhelm Jr., Partner, Telecommunications, Media and Technology Group at Bingham McCutchen LLP titled, “Effects of Broadband Policy and Economic Stimulus on Innovation at the Edge and in the Cloud.” The meeting was chaired by Simon Ma, Secretary, IEEE ComSoc SCV and was attended by approximately 30 chapter members. Despite the relatively low turnout, the number of questions which were raised and discussed during the talk and subsequent Q&A reflected the keen interest amongst the attendees in the broad topic of the Federal Communications Commission’s (FCC) National Broadband (NB) Plan.
Presentation Highlights:
Mr. Wilhelm explained that the FCC (on behalf of the federal government) believes that broadband can form a strong foundation for economic success and has hence drafted the NB plan. The FCC’s primary objective for the NB plan is to spur broadband deployment nationwide through innovation in devices and applications, which in turn, it is hoped, will drive broadband adoption amongst the United States populace. Furthermore, the FCC has designated that the plan must seek “to ensure that all people of the United States have access to broadband capability” and establish benchmarks to meet that goal. In fact, the foregoing statement also delineates the current top internal priority of the FCC. According to Mr. Wilhelm, a data rate of 3 Mbps is regarded as “broadband” within the United States. Underlining the non-trivial nature of the NB plan objectives, Mr. Wilhelm pointed out that several key challenges will need to be overcome to ensure the plan’s success. These challenges include agency and administrative action among the FCC, National Telecommunications and Information Administration (NTIA) and Rural Utilities Service (RUS), legislative action by Congress and a fair competition policy determined by the Federal Trade Commission and protected by the Department of Justice.
Regarding the objective of ensuring that all people of the United States (US) have access to broadband capability, Mr. Wilhelm noted that the American Recovery and Reinvestment Act of 2009 (ARRA) has allocated $7.2 billion in stimulus funds for the expansion of broadband facilities and services to so-called unserved, underserved and rural areas of the country. Additionally, other ARRA-born programs including health care, smart grid and transportation may also promote large-scale broadband adoption. Describing the response to the first round of funding applications, the talk indicated that nearly 2,200 applications were received, requesting a total of $28 billion, with $23 billion requisitioned for broadband infrastructure. The presentation also elaborated on the fact that, in addition to the $7.2 billion in stimulus funds for broadband expansion, over $19 billion has been earmarked for Health Information Technology (HIT), including over $16 billion in medical provider incentives for deploying HIT. The aforementioned funding for HIT is aimed towards developing a nationwide health IT infrastructure which allows for electronic storage, transmission and retrieval of healthcare-related information. The talk also provided attendees with a view into the workings of the FCC with regard to new policy generation, such as Notice of Inquiry (NOI) releases and the holding of workshops to close gaps in the comments obtained from an NOI release.
Mr. Wilhelm then described the current broadband scenario in the US in terms of deployment, user adoption as well as a qualitative description of the state-of-the-art in hardware and software systems as found in US homes and offices. It was interesting to note that, while the US leads the world in internetworking equipment, semiconductor chipsets, software and internet services and applications, the US suffers from conditions which are fairly unexpected for a country of its economic stature. These latter conditions include the fact that 50-80% of homes may get the broadband speeds they need from only one service provider, the fact that broadband adoption is lagging in certain customer segments and the fact that deployment costs for various geographies are significantly different. Further elaborating on the shortcomings in the broadband services faced by users in the US, Mr. Wilhelm pointed out that, for the median user during peak hours, actual download speeds are only about half of the advertised speed! Moreover, around 5 million homes get less than 768 kbps and approximately 35 million homes get less than 10 Mbps. Other broadband service drawbacks faced by US-based customers include the fact that several market segments show penetration rates significantly below the 63% average and that the lack of widespread adoption may entail a social cost in the future in terms of lowered access to jobs, education, government services and information. For example, high school and university students who have little to no Internet connectivity will be at a growing disadvantage compared to students who have good quality access to the Internet.
Thereupon, the talk pointed out how high-quality broadband connectivity enables innovations across a broad swath of national priorities – for example, health care (electronic health records, telemedicine and remote/mobile monitoring), energy and environment (smart grid, smart home applications and smart transportation), education (STEM, eBooks and content, electronic student data management), government operations (service delivery and efficient administration, transparency in governance and civic engagement), economic opportunity (job creation, job training and placement, and community development) and public safety (next generation 9-1-1, alerts and cybersecurity). On being queried whether retail services are currently the dominant application of broadband communications, Mr. Wilhelm acknowledged the pertinence of the question, but was unable to comment further on the topic since the FCC report had not been released at the time of this talk.
The presentation then delved into topics such as regulation and deregulation of broadband networks, network neutrality, spectrum policy, investment in telecom systems and services, and next-generation 9-1-1 systems. Explaining the significance of Internet services like DSL being removed from under Title II of the Telecommunications Act as a result of deregulation, Mr. Wilhelm pointed out that since the DSL service is no longer under Title II, the FCC can no longer protect DSL customers and small DSL companies from being controlled by telcos or network service providers. With regard to net neutrality, the case of Comcast versus the FCC, in which Comcast argued that its Internet service was not under the purview of Title II and hence outside the FCC’s authority, was briefly touched upon. A question was then posed on whether managed services were expected to crowd out non-managed services such as best-effort services. An audience member offered that the very same issue is being discussed in the public domain and that no clear consensus has been reached on the topic. On the subject of spectrum policy, the talk reiterated the oft-heard chorus in telecom circles that the currently allocated spectrum is woefully inadequate to meet projected future demands (especially for mobile broadband applications). Mr. Wilhelm then elaborated on the need for investment in telecom services and technology, since venture capital investments in these sectors have fallen significantly in recent years. According to Mr. Wilhelm, investment in telecom is a key ingredient in promoting innovation across the hardware, software, network and services ecosystem, and the absence of strong investment could result in reduced value of services to end-users.
Pointing out that broadband communications can support public safety and homeland security efforts, Mr. Wilhelm then touched upon the prominent areas of public safety which can be improved as a result of a new broadband initiative such as the national broadband plan. These areas are next-generation 9-1-1 systems, cybersecurity, alerting and a nationwide public safety network. For 9-1-1 systems, Mr. Wilhelm suggested the possibility of having an all-IP based system and to also allow users to submit recorded video to the 9-1-1 operators who could then dispatch the user videos to first responders.
Analysis:
The national broadband plan which the FCC will release (which, at the time of the writing of this article, has been released) is a key step in promoting the widespread adoption of broadband connectivity within the US. If a large portion of the US population gains access to broadband communication systems, the US can continue leading the world in technology innovations in telecom hardware, software and services sectors. Indeed, we believe that it is imperative that the FCC’s objectives of widespread broadband adoption be met in order to help meet other national goals such as homeland security, economic opportunity, healthcare and education. However, as was pointed out by Mr. Wilhelm, the adoption and retention of broadband communications among US users will entail significant investment in the telecom services and technology fields by venture capitalists as well as the federal and state governments. The lack of adoption could result in the exacerbation of the digital divide, especially in the education sector where students from schools which are not well-funded may fall behind in acquiring the skills and knowledge necessary to compete in higher education and (subsequent) job markets. On the other hand, the successful adoption of broadband communications could contribute an order of magnitude improvement in the quality of life for American citizens and further their nation’s leadership in the technology arena.
CSO Perspectives and SaaSCon report: Cloud Computing Security Remains a Conundrum
Abstract:
Prospective and existing cloud computing users often cite security as one of their biggest concerns, particularly with public or hybrid clouds. The lack of standards for security, federated identity, and data handling integrity hasn’t done anything to alleviate those worries. For example, Software as a Service (SaaS) or Platform as a Service security contracts often lack contingency plans for what would happen if one or more of the companies involved suffer a disruption or data breach. And it’s not generally known what type of security exists when data passes between clouds (private-to-public or public-to-public). There’s even talk of Virtual Private Clouds, but no one really knows what that is either.
Enterprise customers, cloud providers and vendors are having difficulty sorting out the many potential problems and resolving the finger pointing over who is responsible for what in the event of a data breach or other security trouble, especially over a shared infrastructure. In particular, there is no standard way of gathering the required information or isolating the problem in a multi-vendor cloud environment. In fact, cascading security breaches are possible. That would really play havoc with cloud users’ data and apps.
Users and vendors are just starting to seriously examine these unresolved issues through industry associations, such as the year-old Cloud Security Alliance. So the Cloud Security related sessions at the co-located CSO Perspectives and SaaSCon conferences took on an increased sense of importance and urgency.
Conference Highlights:
1. Panelists at a joint session on Cloud Security made the following observations:
-Security problem isolation and prevention of cascading security breaches must be specified in the Cloud contract or SLA.
-The cloud vendor should log all inappropriate or unauthorized access incidents.
-The cloud security market needs to understand the nuances of data loss due to security breaches.
2. At a minimum, a Cloud Computing SLA should include:
a] Security of data, e.g. encryption mechanism
b] Up time/availability
c] Forensics of each security breach, especially across a shared infrastructure
d] Data portability to accommodate multiple vendor relationships
e] Being able to change the server OS (e.g. Windows to Linux) without disrupting existing applications
f] Business continuity and contingency planning in the event of a failure(s)
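A customer reviewing a draft contract could treat the six points above as a literal checklist. A minimal sketch in Python, with hypothetical item names mirroring the list:

```python
# Hypothetical checklist for reviewing a cloud SLA draft; the item names
# mirror points a] through f] above and are illustrative only.
REQUIRED_SLA_ITEMS = {
    "data_encryption",      # a] security of data, e.g. encryption mechanism
    "uptime_guarantee",     # b] up time / availability
    "breach_forensics",     # c] forensics of each security breach
    "data_portability",     # d] portability across vendors
    "os_migration",         # e] changing the server OS without disruption
    "business_continuity",  # f] contingency planning for failures
}

def missing_sla_items(contract_items):
    """Return the checklist items a draft contract does not yet cover."""
    return sorted(REQUIRED_SLA_ITEMS - set(contract_items))

# Example: a draft covering only encryption and uptime still has four gaps.
print(missing_sla_items(["data_encryption", "uptime_guarantee"]))
```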
3. The following items were said to be needed, but currently missing from the cloud computing environment:
a] Standards or Interoperability Agreements
b] Benchmarks to compare cloud services with one another
c] Federation of identities to facilitate a single sign-on procedure for multiple inter-connected clouds.
4. Interesting quotes:
a] Jim Reavis, co-founder of the Cloud Security Alliance, said, “It’s important we understand there isn’t just one cloud out there. It’s about layers of services. We’ve seen an evolution where SaaS providers ride atop the other layers, delivered in public and private clouds.” I believe the implication was that Infrastructure as a Service is layer 1 (the data center layer), Platform as a Service is layer 2 (the application development/tools layer), and SaaS is layer 3 (the application run-time layer).
b] Ed Bellis of on-line travel agency Orbitz said, “It’s a challenge, working with partners to get on same page. Early on there were many things we didn’t expect. Federation of identities in our internal systems became a challenge because of differences between our internal procedures and those of the SaaS provider.” “In your SLAs, you need to have clear language for how data will be handled and encrypted and, in the event of a security breach, the contract must have clear language on who is responsible for specific aspects of the investigation. Build these considerations into the contract side.”
c] Keith Waldorf, VP of operations at Doctor Dispense, a point-of-care online medication and e-pharmacy provider, said one of his company’s most painful experiences in this area was on the contract side. “The lack of common standards really surprised us.” Waldorf said he once was a client of an (anonymous) cloud service provider that upgraded its offerings, but his company was unable to take advantage of the upgraded services because the original SLA locked him into using only the software and hardware that was available at the time he initially signed the contract.
d] Jeff Spivey, president of Security Risk Management Inc., said “the vendors are driving the service, rather than the market defining its needs.” The previous day, Jeff presented on the threat of “black swan-like” security threats and cautioned the security oriented audience to monitor for “weak signals (of potential threats).”
5. Microsoft reiterates that they “are all in” with respect to Cloud Computing.
Tim O’Brien, Microsoft Platform Strategy Group manager, said that what really matters is what cloud-based service delivery can do for the customer. Microsoft will be moving “category leading products and platforms” to the cloud: for example, Exchange Online (e-mail), SharePoint Online (collaboration), Dynamics CRM Online (business apps), SQL Azure (structured storage) and AD/Live ID (Active Directory access) as its lead services for businesses. All of these are designed to run on Windows Server 2008 in the data center and integrate with the corresponding on-premises applications. They will also work together with standard Microsoft client software, including Windows 7, Windows Phone, Office and Office Mobile.
In addition, the company is offering its own data centers and its own version of Infrastructure as a Service for hosting client enterprises’ apps and services. It is using Azure (a full online stack consisting of Windows 7, the SQL database and additional Web services) as a Platform as a Service for developers. Microsoft Online Services are up and running. They include Business Productivity Online Suite, Exchange Hosted Services, Microsoft Dynamics CRM Online and MS Office Web Apps. On the consumer side, Microsoft soft-launched a cloud backup service called SkyDrive about two weeks ago. SkyDrive is an online storage repository for files that users can access from anywhere via the Web. The web edition of MS Office 2010 will be free to all Windows Live account holders this May. (We wonder how that will affect the company’s profits, which have always depended on desktop sales of MS Office.)
In summary, it’s clear that Microsoft has a comprehensive strategy in place; users will now have to try the cloud-based products and services and decide how integrated they really are.
The following from Tim O’Brien provides additional information and insight on cloud security and the Web version of MS Office 2010:
Relative to cloud security, there are a number of resources you can access on our technical sites, some of which I’ve included here:
http://technet.microsoft.com/en-us/security/ee519613.aspx
“For Office, you simply sign into http://skydrive.live.com with your Windows Live ID, and you can use the document workspace for your Office docs, and view/edit them in the browser using the Office Web Apps (specifically, Word, Excel, PowerPoint, and OneNote). To create a file, you can click on “New” for a drop down menu of these four apps, and off you go…”
References:
1. Frustrations with cloud computing mount
– Lack of standards, industry agreements get more attention as industry expands
Cloud computing lacks standards about data handling and security practices, and there’s not even any agreement about whether a vendor has an obligation to tell users if their data is in the U.S. or not.
The cloud computing industry has some of the characteristics of a Wild West boom town. But the local saloon’s name is Frustration. That’s the one word that seems to be popping up more and more in discussions about the cloud, particularly at the SaaScon 2010 conference here this week.
That frustration about the lack of standards grows as cloud-based services take root in enterprises. Take Orbitz LLC, the big travel company with multiple businesses that offer an increasingly broad range of services, such as scheduling golf tee times and booking concerts and cruises.
http://www.computerworld.com/s/article/9175102/Frustrations_with_cloud_computing_mount
2. SaaS, Security and the Cloud: It’s All About the Contract
-Security practitioners have learned the hard way that contract negotiations are critical if their SaaS, cloud and security goals are to work. A report from CSO Perspectives and SaaScon 2010.
Perhaps the most important lesson is that contract negotiation with providers is everything. The problem is that you don’t always know which questions to ask when the paperwork is being written. Panelists cited key problems in making the SaaS-Cloud-Security formula work: SaaS contracts often lack contingency plans for what would happen if one or more of the companies involved suffers a disruption or data breach. The partners (the enterprise customer and the vendors) rarely find it easy to get on the same page in terms of who is responsible for what in the event of trouble. Meanwhile, they say, there’s a lack of clear standards on how to proceed, especially when it comes to doing things in the cloud. Add to that the basic misunderstandings companies have on just what the cloud is all about, said Jim Reavis, co-founder of the Cloud Security Alliance. Somewhere in the mix, plenty can go wrong.
“If you’re in a public cloud situation and Company B is breached, a lot of finger pointing between that company and different partners will ensue,” Reavis said. “If this isn’t covered in the terms of agreement up front, you have no hope of recovering data (or damages).”
Security vendors can be part of the problem as well. In a recent CSO article about five mistakes one such vendor made in the cloud, Nils Puhlmann, co-founder of the Cloud Security Alliance and previously CISO for such entities as Electronic Arts and Robert Half International, noted that the vendor — who was not named — did “everything you can possibly do wrong” when rolling out the latest version of its SaaS product, leading to users uninstalling their solution in large numbers.
http://www.csoonline.com/article/589963/SaaS_Security_and_the_Cloud_It_s_All_About_the_Contract
3. Microsoft is moving ever deeper into the data center, exploring frontiers it hasn’t frequented in the past.
SANTA CLARA, Calif.—Only a year ago, the idea of Microsoft showing cloud computing services at an event like SaaSCon would not have computed one bit.
The world’s largest software company has been late to the party on a few things—the Internet being a classic example—but times and its corporate attitude have changed. They had to. Microsoft, whose executives not long ago were often quoted as hating cloud computing because it cuts directly into their core business, already has swallowed its pride to embrace open source—well, to a certain extent. The company also is trying to move deeper into the data center, exploring frontiers it hasn’t frequented in the past. At SaasCon 2010 here at the Santa Clara Convention Center April 6 and 7, Microsoft had its first booth dedicated strictly to business cloud services. It’s an ambitious plunge into a market already full of veteran players and bright newcomers alike.
4. A Tale of Two Clouds
The cloud is the answer to all our IT problems — from poor performance to lack of scale to high energy costs. The cloud is a sucker’s game that merely shifts responsibility for IT infrastructure to different hands, leads to performance issues of its own and leaves your data more open to theft. If both of those statements happened to be true — and we won’t know for sure until it starts to amass significant workloads — would that alter your plans to deploy cloud infrastructure in any way? Apparently not, if the latest research is to be believed.
On the one hand, we have reports from groups like Global Industry Analysts that predict the cloud services market is set to top $200 billion in the next five years. That would represent a blazingly fast growth curve, driven largely by enterprise needs to cut costs and expand capabilities in what is likely to be a mediocre economy at best. But it’s tough to square that level of acceptance with the increasing anecdotal evidence that suggests a large number of IT professionals are hesitant to place too much reliance on the cloud due to security concerns and a lack of interoperable standards.
http://www.itbusinessedge.com/cm/blogs/cole/a-tale-of-two-clouds/?cs=40604
The need for a Unified Set of Cloud Computing Standards within IEEE
From Kevin Walsh of UCSD:
I think that establishing a unified cloud standards framework under the auspices of a standards organization such as the IEEE deserves further discussion. (I know it would be well received by my government customer.)
Pointers below….
See –
http://standards.ieee.org/announcements/2009/pr_cloudsecuritystandards.html
http://www.elasticvapor.com/2009/11/iso-forms-group-for-cloud-computing.html
which is covered by this ISO section –
http://www.iso.org/iso/standards_development/technical_committees/other_bodies/iso_technical_committee.htm?commid=601355
The ACM is also in the mix with a conference planned for early June. See
http://research.microsoft.com/en-us/um/redmond/events/socc2010/index.htm
see the accepted papers thus far –
http://research.microsoft.com/en-us/um/redmond/events/socc2010/program.htm
In general, I like the IEEE process, and the organization is well respected, from my point of view. Their standardization process is mature. See
http://standards.ieee.org/guides/opman/index.html
and
http://standards.ieee.org/resources/development/index.html
From Robert Grossman:
The url for the new IEEE Cloud Computing Standards Study Group:
http://www.computer.org/portal/web/standards/cloud
I’ll send a separate note to Steve Diamond, who is coordinating it.
I am looking forward to following up with you regarding the virtual networks effort.
There is also an IRTF research group on virtual networks, the Virtual Networks Research Group (VNRG).
From Gary Mazzaferro:
I enjoyed your comments about the need for cloud standards initiatives. I’m of the same opinion and have slowly been moving towards a more collaborative initiative. The largest challenge is balancing time and funding. I have little time because of the lack of funding for the project. 🙂 I do have an idea how to make it work and gain participation by the user community.
From InformationWeek:
The bigger message was that there is still much work to do in this area. With a ton of standards bodies emerging today, and vendors coming to market with their own unique APIs, it’s becoming difficult to have one voice.
If cloud is going to gain any kind of traction, let alone achieve the nirvana of the Inter-cloud, then we must have some level of standards in place to make it happen. As we’ve seen historically, not having standards in place has created challenges around interoperability, as well as vendor lock-in. The value proposition around cloud computing is negated if interoperability is not possible. It’s as simple as that. No ifs, ands, or buts.
http://www.informationweek.com/cloud-computing/blog/archives/2010/03/4_thoughts_from.html
NIST, a federal agency that has been instrumental in defining cloud computing, will take on an additional role as a central publisher of cloud use cases accompanied by a recommended reference technology implementation. “The airing of strong use cases where a technology set is deemed suitable for a particular problem could lead to a specification for a standard,” a NIST representative said Wednesday in an interview at the Cloud Connect show in Santa Clara, Calif.
http://www.informationweek.com/news/government/cloud-saas/showArticle.jhtml?articleID=224000007
Please feel free to leave a comment below or email me and I’ll include it.
Here is a link to view what others have written about the Cloud Connect Conference I attended in Santa Clara, CA:
IDC Market Forecasts for Mobile Broadband and LTE
At the March 10th IDC 2010 Directions Conference in Santa Clara, IDC analysts Amy Lind and Carrie MacGillivray predicted a 32% compound annual growth rate (CAGR) for global mobile broadband connections, which were projected to reach over 350M by 2013. [We wonder if those include M2M connections, which are potentially much larger than human-held device connections.]
More significantly, LTE was predicted to have a CAGR of 471%, with 2012 (and later for some countries) as the critical inflection point for LTE mass adoption. The technology was said to offer improved capacity and full mobility (vs “mobile” WiMAX portability), to be initially oriented toward PCs, and to have pricing in flux as operators continue to rethink their business models.
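For context, a CAGR compounds multiplicatively, so even the more modest 32% figure implies roughly a tripling of connections over the four years to 2013, while 471% growth from LTE’s tiny base multiplies it over a thousandfold. A quick sketch (illustrative values only):

```python
def project(base, cagr, years):
    """Project a value forward at a compound annual growth rate (CAGR)."""
    return base * (1.0 + cagr) ** years

# A 32% CAGR roughly triples the starting value over four years;
# a 471% CAGR multiplies it by roughly 1063x over the same span.
print(round(project(1.0, 0.32, 4), 2))  # ~3.04
print(round(project(1.0, 4.71, 4)))     # ~1063
```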
By 2013, IDC predicts:
- Mobile broadband will be ubiquitous and the de facto way of communicating
- Business models will be focused on revenues per subscriber or device
- Global mobile services spending will surpass $975 billion
- Iconic 4G devices will be critical to success
The two IDC analysts offered their essential guidance to session attendees:
- Wireless carriers should place emphasis on data services, which are essential for revenue growth.
- Detailed market segmentation is required to focus devices (and apps) on relevant audiences. To achieve this objective, IDC believes that wireless network operators will deepen partnerships with device and application vendors (AT&T and Clearwire are already doing this now).
- Integration is key to staving off wireless displacement and driving mobile broadband adoption
In a separate presentation, IDC Research Manager Godfrey Chua was very optimistic about LTE. This author was stunned to hear Mr. Chua predict that LTE infrastructure equipment sales would overtake all WiMAX infrastructure sales by 4Q 2011! That’s less than 18 months from now! According to Mr. Chua, both AT&T and VZW are looking to LTE to effectively deliver high quality mobile broadband service at the lowest cost per bit possible (through the more cost-efficient OFDM-based modulation and multi-carrier transport). He sees 2012-2013 as the LTE market inflection point, which is consistent with the opinion of other IDC analysts. Why have all the major global cellular operators made such an early commitment to LTE? Here are a few reasons given:
- To deliver high quality mobile broadband at the lowest cost per bit
- To relieve 3G capacity pressure by migrating laptop users to LTE
- To create a more robust platform for applications and services –that lead to new business models and therefore revenue streams
Godfrey next compared the rationale and position of LTE (vs WiMAX):
- To address capacity pressure in 3G networks (vs WiMAX to address underserved broadband connectivity demand)
- Full mobility is the value proposition (vs WiMAX portability for netbook/notebook access)
- Geared towards developed markets (vs WiMAX orientation toward emerging markets)
- Relevance to emerging markets not until 2015 (vs WiMAX being always relevant to emerging markets)
2010 will be a critical year for LTE network equipment companies as they all seek to build momentum in the forthcoming global market. Mr. Chua sees Ericsson and Huawei as early leaders in providing LTE gear. He says that Alcatel-Lucent’s Verizon Wireless win is key, but now they must convert trials into contracts. Meanwhile, Nokia Siemens Networks is looking to maintain relevance in LTE. The competitive pressure will surely intensify as other players (Motorola, ZTE, NEC and Fujitsu) seek to up the competitive ante.
In closing, Godfrey offered the following essential guidance:
- Realization of the long-held vision for the network is near
- Mobile data traffic will continue to explode
- Network transformation is critical; it is key to remaining competitive
- Green efforts will persist; they go hand in hand with the transformation process
- Vendor positions will continue to shift
Some additional predictions from IDC Analysts:
John Gantz, Chief Research Analyst:
-By the end of 2010, there will be 1B mobile Internet users and 500K mobile phone apps; 1.2 billion mobile phones will be sold, including 220 million smartphones; 630 million laptops will be in place, including 80 million netbooks.
-There will be many intelligent devices communicating with machines/computers. M2M is a potential high-growth area.
-Complexity will increase 10X in the next 10 years
-By 2020, there will be 31B connected devices, 2.6 billion phones, 25M apps, 450B interactions per day, 1.3T tags/sensors
Rick Nicholson, Vice President, IDC Energy Insights:
Workshop Report: Clearwire on track with rollouts and app tools, but MSO partners struggle with Business Models
Disclaimer: Unlike many “would-be journalists” who are either always negative on WiMAX or are perennial Pollyannas producing an endless stream of recycled “happy talk,” this author tries to be balanced and objective about WiMAX in general and the WiMAX events covered here in particular. We have been covering WiMAX for over 6 years now, with more than 200 published articles on that technology. This author has no business relationships with Clearwire or any other WiMAX-related company or entity. Please read on……
Introduction
Clearwire briefed potential application developers at a well attended CLEAR Developer workshop in Santa Clara, CA on March 2, 2010. The key sessions were Upcoming 4G WiMAX APIs and Tools, The 4G WiMAX Business Opportunity for Developers, and the wrap up session revealing where Clearwire is now and where they’re going. You can find all the Sessions and speakers here.
We will skip the discussion of WiMAX APIs and Tools, which was already covered in detail at the Feb 10th IEEE ComSoc SCV meeting (you can access the slides at: http://www.ewh.ieee.org/r6/scv/comsoc/Talk_021010_CLEARDeveloperOverview.pdf).
Nonetheless, we noticed a lot of keen interest amongst developers who were accessing Clearwire’s Silicon Valley 4G Innovation Network using 4G USB sticks attached to their notebook PCs. It seems indoor coverage worked fine in the Santa Clara Convention Center, where the workshop was held.
However, we were quite disappointed that neither Comcast nor TW Cable had any new services to tell us about, despite the video content and managed networks they each own. More about this later in the article.
The Wholesale Opportunity
Clearwire (CLWR) was said to own more licensed spectrum in major cities than any other wireless network operator. Their “4G”+ network, known as CLEAR, now covers more than 34 million points of presence (POPs) as of 4Q 2009. It’s also commercially available in 28 U.S. cities, including Seattle, Honolulu and Maui. CLWR plans to build out their mobile WiMAX network to reach 120 million POPs by the end of 2010. They’ll have launched CLEAR service in most major U.S. cities by the end of the year, including New York, San Francisco, Boston, Houston, Kansas City and Washington, DC. By this time next year, the CLEAR network will stretch from coast to coast and cover all the major U.S. cities.
In addition to selling “4G” fixed and mobile wireless broadband Internet access, Clearwire has MVNO (wholesale) agreements with three of their large investors (Sprint, Comcast, and TW Cable), who are reselling the service under their respective brand names. These partners were said to have a combined customer base of approximately 75M subscribers, and their well-known brand names should help the combined entities achieve a critical mass of customers much quicker than if Clearwire alone were selling WiMAX services. Wholesale resellers will also drive WiMAX ecosystem development and investment, according to Randy Dunbar, Vice President, Wholesale Marketing & Strategy, Clearwire.
Mr. Dunbar told the audience that Clearwire will be signing up more MVNO resellers in the near future. These may include companies involved in: consumer electronics, retail, CLECs, pre-paid/targeted market segments, smart grid and M2M (least understood by Clearwire, but with tremendous potential). The new resellers will help mobile WiMAX deployment in diverse market segments such as: mobile consumers, home entertainment, power Internet users, SOHO, small business, large enterprise, vertical businesses and road warriors (business travellers).
Currently, there is only one known handheld device available for CLEAR: the Samsung Mondi. “4G” access is currently obtained using an external USB modem or “dongle,” embedded WiMAX in a PC, or a “personal” WiFi hotspot (many of which require an external USB dongle to access the WiMAX network). But Mr. Dunbar said that a “range of connected devices” is coming for CLEAR. These devices include: smartphones, STBs, DVRs, mobile modems, MIDs and consumer electronics gadgets (such as portable media players). Randy hit my hot button when he stated that programmed video and time/place-shifted video would be delivered via the 4G CLEAR network (see the next section of this article).
+ IEEE 802.16e-2005 based Mobile WiMAX (being deployed by Clearwire and partners) is actually 3G according to the ITU-R; IEEE 802.16m will be the 4G version of mobile WiMAX, but Clearwire has not committed to that yet.
Cable (MSO) MVNOs reselling Clearwire’s mobile WiMAX network
Comcast, the largest MSO in the U.S., resells the CLEAR network as “High Speed 2 Go.” Its branded mobile WiMAX service is available in Portland, Atlanta, Chicago, Philadelphia and the Seattle/Bellingham area. Katie Graham, Director, Wireless Business Development, said there were two ways mobile WiMAX could be purchased from Comcast:
- Fast pack: cable Internet (home access) bundled with High Speed 2 Go
- Bolt on: 4G mobile WiMAX only or 3G/4G (using Sprint’s EVDO network for 3G)
A free WiFi router is included with a High Speed 2 Go subscription. More details on the Comcast mobile WiMAX service are at: http://www.comcast.com/highspeed2Go/#/highspeed2go
TW Cable has been completely spun off from Time Warner as a separate company (which means they don’t own any video content). Their CEO had recently stated that high-speed Internet was replacing video as the firm’s core product. TW Cable currently serves 14.6M customers in 28 states. They claim to be the third largest broadband ISP in the U.S., with 9M subscribers. Brian Coughlin, Manager, Wireless Platforms for TW Cable, told the audience that data-oriented wireless products and services would be first priority for the company, with voice and mobile phones coming later. Brian stated that “digital media and service must be adaptable” and that an ecosystem would be required for this. I took this to mean that digital media and video services needed to be able to adapt to broadband access via mobile WiMAX, but I was wrong (see below for the reason).
The two MSO behemoths were asked by this author why they haven’t offered any premium video services or VoD over mobile WiMAX and they appeared to be stumped. Some of the explanations given were:
“The technology is ahead of the business models.” Clearwire
“The industry hasn’t figured out how to monetize the video applications.” TW Cable and Clearwire
“It’s definitely on our radar screen, but we don’t have anything we can announce at this time.” TW Cable
“Digital content rights are based on a given device, not on a service.” Comcast
We were perplexed by these statements. In particular, we do not understand why Comcast can offer On Demand digital video* over their managed network and cable Internet service, but not over mobile WiMAX.
* For details on Comcast On Demand On line service please visit:
http://www.comcast.net/on-demand-online/
Kittur Nagesh, Service Provider Marketing Manager at Cisco, also participated in this panel. He made three statements I thought were very important:
- “Video will be 66% of mobile data traffic by 2013.”
- “The spectrum Clearwire owns is remarkably important. It’s important to make use of the spectrum (a wireless network operator) you have. It doesn’t matter if it’s used for WiMAX or LTE.”
- “M2M applications will be phenomenally important. It will be an inflection point (for the broadband wireless industry). Innovation will explode in an unbounded fashion.”
Shortly after this event, Cisco withdrew from the WiMAX RAN equipment market. They had been selling WiMAX base stations (from the Navini acquisition), but they now think there are better opportunities in the mobile packet core (via their acquisition of Starent Networks).
Wrap Up Session: Clearwire now and in the near future
Dow Draper, Clearwire Vice President for Product Development and Innovation, told the audience that the average Clearwire customer is using 7G bytes of downloaded data per month — a number that Clearwire only expects to increase over time. That compares with an average 3G data card download of 1.4G bytes/month and an iPhone 3G average download of 200 M bytes/month.
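Those averages put the gap in perspective. The ratios below use only the figures quoted in the talk:

```python
# Average monthly downloads cited by Mr. Draper, converted to megabytes.
usage_mb = {
    "clearwire_wimax": 7000,  # 7 GB/month, average Clearwire customer
    "3g_data_card": 1400,     # 1.4 GB/month, average 3G data card
    "iphone_3g": 200,         # 200 MB/month, average iPhone 3G
}

# A Clearwire customer downloads 5x what a 3G data card user does,
# and 35x what an average iPhone 3G user does.
print(usage_mb["clearwire_wimax"] / usage_mb["3g_data_card"])  # 5.0
print(usage_mb["clearwire_wimax"] / usage_mb["iphone_3g"])     # 35.0
```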
Mr. Draper also said that the S.F. Bay Area can expect commercial WiMAX service by “late 2010,” and that “multiple smart phones” would be running on the Clearwire network before year’s end. Dow also hinted at other upcoming devices for CLEAR: MIDs, Portable Media Players (PMPs), tablets and embedded devices. He distinguished between category 1 devices which are tested and sold by Clearwire and category 2 devices which are sold through channels (and presumably retail stores).
“Clearwire will support multiple operating systems, especially Android,” said Mr. Draper. In summing up, he said that third-party developers, differentiated devices, services and applications are all critical in attracting customers for Clearwire and their MVNO resellers. While we completely agree with that statement, we think that the devices need to come to market very quickly (they’ve been promised for quite some time by Intel but haven’t materialized). But even more important are the differentiated services, such as video, whether for entertainment, education, or surveillance.
Next Clearwire workshop:
4-G WiMAX Developers Symposium, Jun 15 10:00AM to 5:00PM Stanford University
Topics Include:
- The latest on 4G WiMAX API’s and tools
- 4G WiMAX 101 basics for developers & network and device architects
- Market opportunities for 4G developers with symposium sponsors: Clear, Time Warner Cable, Sprint, Intel, Comcast, Cisco
- Business sessions from leading 4G industry executives
- 4G trends and forecasts
- Open discussion on the future of mobile internet innovation
Details at: http://scpd.stanford.edu/search/publicCourseSearchDetails.do?method=load&courseId=6650469
Developer Opportunities with CLEAR WiMAX 4G: Clearwire’s initiative for apps development for WiMax systems
IEEE ComSoc SCV February 10th, 2010 Meeting Report
“Developer Opportunities with CLEAR WiMAX 4G”
This very informative and exciting meeting drew a large audience of over 85 attendees. Those present heard directly from Clearwire about their Silicon Valley Innovation Network (based on IEEE 802.16e-2005/Mobile WiMAX) and the tools being made available for CLEAR 4G application developers. Representing Clearwire were Allen Flanagan, Manager, Silicon Valley Innovation Network, and David Rees, Manager, Developer and Partner Enablement. The event featured excellent presentations by both speakers followed by Q&A, as well as a panel discussion where meeting attendees were able to ask questions not already covered in the talks. The presentations and subsequent Q&A were moderated by Sameer Herlekar, ComSoc SCV Technical Activities Director, while the panel discussion was chaired by Alan Weissberger, ComSoc SCV Chairman.
Presentation Highlights:
David Rees, in his talk on “Application Developer Enablement,” first provided some background on Clearwire and its network. Clearwire was said to be a broadband wireless network provider currently operating in 25 markets and reaching 30 million people, with the potential to cover 80 markets and reach 120 million people by the end of 2010. (Note that “people” here refers to the total populace in these markets, not the actual number of Clearwire customers.) Dave told us that Clearwire’s “4G” WiMax network (following the IEEE 802.16e-2005 standard) operates nationwide in the 2.5 GHz spectral band with an average of 150 MHz of spectrum available per market. The principal funding sources for Clearwire’s 4G network deployment plans include Sprint, Comcast, Intel, Time Warner, Google and Bright House Networks. In fact, Clearwire received over $3.2 billion in funding and resources from these sources in 2008 alone. Additionally, the talk elaborated on the benefits of the WiMax-based 4G network over contemporary 3G networks based on EV-DO and HSPA in terms of higher data rates and spectrum capacities, stronger device mobility support, and amenability to low-cost deployment via an all-IP network. Specifically, based on results from drive-testing their network in Portland, OR over a 17-mile distance at an average speed of 35 mph and a top speed of 55 mph, Clearwire claims that their network’s performance is an order of magnitude superior to the 3G networks over the same route in terms of peak and mean data rates as well as network latency. Clearwire’s teams attained a peak data rate of 19 Mbps, a mean data rate of 6.5 Mbps and a mean latency of 83 ms during the drive testing.
Mr. Rees then shifted to Clearwire’s plans for Application Developer Enablement, i.e., the platform which enables (or, to be precise, which Clearwire seeks to enable) application developers and OEMs to devise applications and services which leverage the enhanced speed and capacity of the 4G network. David Rees then explained Clearwire’s philosophy of an “open network for open devices,” where any WiMax-enabled device, including MIDs, camcorders, netbooks and smartphones, will be able to access the 4G WiMax network. In fact, the devices do not necessarily have to be provided by the carriers. However, they will need to be certified by the WiMAX Forum before they can be considered admissible on Clearwire’s 4G network.
The presentation then explained how, in order to support the “open network for open devices” paradigm and the mobile internet applications running on it, appropriate application program interfaces (APIs) need to be made available to developers and service providers. These APIs need to be able to access the mobile device’s location information as well as be aware of the network itself, thereby providing a superior quality of experience to the end-user. Clearwire seeks to provide location information in four ways: a client/server service whereby applications can determine their own locations (“where am I?”); a server/server service which enables geo-fencing or tracking (“where are they?”); location support built into browsers, including Chrome, Firefox and IE; and partnerships with existing location providers to use WiMax for reporting location information. For example, a lightweight JSON/HTTP service allows client applications, including browsers, to query the server for their own latitude/longitude. Clearwire is also working on adding direct support for Google Gears, Firefox and other browsers. Similarly, the server/server service uses the device’s IP address or MAC address to determine the device’s location. Network-awareness will be provided via Session Information and includes enhanced location awareness via triangulation, radio-signal quality information, and diagnostics collection and reporting such as round-trip time to each neighbor and neighboring sector information (both future enhancements). The session information will be provided through a single common API, the so-called CAPI, which Clearwire is currently standardizing and promoting. In fact, CAPI 1.2.1 has already been implemented on a variety of chipsets, including those in Intel’s WiMax-enabled laptops.
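To illustrate the kind of lightweight JSON/HTTP exchange the talk described, the sketch below parses a hypothetical “where am I?” reply. The field names, status values and coordinates are our own illustrative assumptions, not Clearwire’s actual API.

```python
import json

# Hypothetical response body from a lightweight JSON/HTTP location
# service of the kind described in the talk. The field names and
# values are illustrative assumptions, not Clearwire's real API.
sample_response = (
    '{"status": "OK", "latitude": 37.3541, '
    '"longitude": -121.9552, "accuracy_m": 150}'
)

def parse_location(body):
    """Extract a (lat, lon) pair from a JSON location reply, or None on error."""
    data = json.loads(body)
    if data.get("status") != "OK":
        return None
    return (data["latitude"], data["longitude"])

print(parse_location(sample_response))  # (37.3541, -121.9552)
```

A browser or client app would obtain `sample_response` by an HTTP GET to the service endpoint; the parsing step would be the same.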
Moreover, CAPI 2.0, which is currently being standardized, will (when available commercially) include such features as neighboring sector information, handoff notification and a list of applications requesting a particular quality of service (QoS).
Suggesting that Clearwire views mobile video as a major application for the 4G network-enabled mobile internet, the talk elaborated on Clearwire’s efforts to ensure a high-quality end-user experience by supporting in-network video optimization as well as session information provision to video clients, video servers and video service providers to help them optimize their transmission/reception performances.
The final part of the Developer Enablement portion of the presentation focused on QoS and its current status vis-à-vis Clearwire’s 4G network deployments. The talk explained that traffic streams such as video are specified to have a particular service-level in terms of throughput, latency (network delay) and jitter (delay variation). The network then identifies these streams and attempts to provide the desired service-level to each stream, with the goal of minimizing the likelihood of network congestion while concurrently supporting different service-levels. At this time, while QoS has been implemented in Clearwire’s network, it will be offered to users only after Clearwire concludes its ongoing open-internet NPRM talks with the FCC and decides with its partners when QoS-on-demand should be formally deployed. Additionally, Clearwire has yet to finalize its strategy with respect to load balancing (balancing QoS demands from a plurality of end-users), as well as network management once full QoS-on-demand becomes available, possibly in the third quarter of this year.
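To make the service-level notion concrete, here is a minimal sketch of what specifying and checking a stream’s service level might look like. The class, function and threshold values below are our own illustrative assumptions, not Clearwire’s implementation.

```python
from dataclasses import dataclass

@dataclass
class ServiceLevel:
    # A traffic stream's requested service level, in the three terms
    # the talk named: throughput, latency (delay), jitter (delay variation).
    min_throughput_kbps: float
    max_latency_ms: float
    max_jitter_ms: float

def meets_service_level(sl, throughput_kbps, latency_ms, jitter_ms):
    """Illustrative check: does a measured flow satisfy its service level?"""
    return (throughput_kbps >= sl.min_throughput_kbps
            and latency_ms <= sl.max_latency_ms
            and jitter_ms <= sl.max_jitter_ms)

# Assumed thresholds for a video stream (purely for illustration).
video = ServiceLevel(min_throughput_kbps=2000, max_latency_ms=150, max_jitter_ms=30)
print(meets_service_level(video, 2500, 83, 12))  # True
```

In a real network, these checks run continuously per admitted stream, and admission control rejects new streams whose service levels cannot be met without congesting existing ones.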
The Developer Resources section of the talk was presented by Allen Flanagan, who explained the 4G WiMax Innovation Network, also known as CLEAR, in detail. According to the presentation, CLEAR is a pre-commercial network deployed in parts of Silicon Valley, including Palo Alto, Menlo Park, Mountain View and Santa Clara, for testing mobile device-based applications. Registered service, application and content developers may make use of this service for free for the duration of the program. Noting that the CLEAR program encourages app developers to create applications which take advantage of the mobility aspect of the device, the talk pointed out that the Innovation Network is, however, neither a testbed for hardware nor intended for end-users. When invited to suggest an example app for the Innovation Network, Allen Flanagan outlined an application for public safety teams which could perform voice recognition and possibly translation. Such an app could be very useful in disaster-hit areas where local first-responders need to communicate with emergency workers who may not speak the native language. The talk concluded with an invitation to the audience members to receive free passes to the 4G WiMax Developer Workshop to be held at the Santa Clara Convention Center on March 2nd of this year.
Panel Session:
The panel session, moderated by Alan Weissberger, addressed a number of issues, some of which had been raised during the Q&A sessions following the speaker presentations. Both Clearwire reps provided a wide range of information on device and app certification, the locations (stores) where the apps may be obtained, and the all-important question of QoS. During the panel discussion, audience members came to learn that:
There will be no special app store for CLEAR-developed applications. Clearwire welcomes apps purchased through stores belonging to their partners like Intel, Google, Apple and Palm.
Clearwire certifies devices approved by the WiMax Forum, but will certify only those apps which are provided by Clearwire itself; apps developed by third parties will not be certified by them.
Clearwire’s handheld WiMax device will become available later this year.
Network usage and available capacity will be monitored over time and adjustment of capacity will be made based on backhaul traffic statistics. While real-time capacity adjustments are not possible, Clearwire will leverage the 150 MHz of available spectrum to help meet any projected increase in user demand for capacity.
When pressed on whether Clearwire’s business model takes into account a point of network failure due to conflicting user demands for QoS, or whether Clearwire can guarantee that a fixed QoS will be supported by their network for a (statistical) percentage of communicating devices for a (statistical) fraction of the time, the Clearwire reps acknowledged that the QoS-on-demand issue is a major challenge to network planners and that they will relay the question to their engineering teams as developer feedback.
When asked whether Clearwire’s future WiMAX enabled phone would use mobile VoIP or cellular voice, Clearwire’s Allen Flanagan understandably refused to answer as the product has not been officially announced. For speculation on what type of voice would be used on WiMAX handsets, please see:
http://viodi.com/2010/02/16/wimax-handsets-for-clear-vs-mobile-skype-over-vzw/
Analysis and Opinion:
The 4G WiMax network being deployed by Clearwire as a means to enable the mobile internet is certainly a very strong response to the tremendous attention being accorded to LTE in recent months. Additionally, the 4G Innovation Network program, a.k.a. CLEAR, is a clever approach to attract app and content developers by providing them with free access to the WiMax network prior to any commercial deployments. That the CLEAR program targets Silicon Valley is not surprising either, given the vast number of hi-tech innovations which have been born in the valley. With regard to air interfaces, Clearwire’s drive-testing results showed that OFDM-based WiMax combined with 150 MHz of available spectrum can achieve significant mobility support for major bandwidth-hungry applications such as mobile video (we note, however, that Clearwire’s drive-testing was undertaken at off-peak hours to minimize any loading effects).
Many industry experts predict that mobile video is the killer application for the true mobile internet. This viewpoint does not seem to have been lost on Clearwire, given their emphasis on supporting mobile video on their 4G network. In order to support apps which feature mobile video as well as other applications, Clearwire will provide APIs which support RF awareness, location awareness and network awareness. On the crucial issue of QoS, Clearwire has enabled QoS in its network even though it is not yet available to end-users, pending talks between Clearwire, its partners and the FCC.
The solid attendance for this meeting and the number and variety of questions raised by the attendees point to great enthusiasm among potential app and content developers to leverage the promise of Clearwire’s 4G network.
Besides the current unavailability of QoS guarantees to support applications like mobile video, the network’s ability to support multiple bandwidth-hungry applications from multiple users (numbering in the hundreds of thousands in some markets) is still an open question which Clearwire’s engineering teams will no doubt be actively working to answer. However, Clearwire has undertaken notable steps in its quest to acquire a lion’s share of the mobile internet network operator market, including enticing potential app and content developers and service providers to develop a plethora of mobile apps via the CLEAR program. It is now up to developers to unleash their creativity and devise apps which are innovative, timely and which truly provide end-users with a superior quality of experience on their mobile devices. Simultaneously, Clearwire needs to live up to its promise of providing a network which supports high mobility and highly differentiated network services, including QoS-on-demand, for an overall customer base which potentially numbers in the millions.
References:
Here are a few links to articles, written by Alan Weissberger, which describe Clearwire’s Innovation Network and Developers program:
http://www.wimax360.com/profiles/blogs/archived-4g-webcasts-and
GSA Silicon Series Seminar: New Markets, New Economics, Feb 10, 2010 Santa Clara, CA
This very informative, analyst-only panel discussion assessed the outlook for new and growing markets throughout the semiconductor industry. While several markets were covered, this summary will focus exclusively on the hot communications-related markets and products. During the meeting, analysts from FBR Capital Markets, iSuppli, Databeans and Gartner shared their perspectives and predictions on what’s hot and why in several markets of interest to communications and network practitioners.
Craig Berger of FBR Capital Markets singled out smart phones and Set Top Boxes (STBs) as key semiconductor industry drivers. He said that China’s 3G infrastructure build-outs would have ripple effects across the industry and accelerate in late 2010. China is expected to spend over $60B in 3G related telecom equipment over the next two years. That’s certainly impressive! Craig opined that he expected India to "ramp up" 3G network production, but didn’t say when (India’s 3G licensed spectrum auctions have been postponed for well over one year now). In addition to 3G, Mr. Berger commented that wireless interfaces, such as WiFi, Bluetooth, and GPS, were propagating much more broadly into hand held devices that didn’t have much or any IC content previously.
FBR sees a continued proliferation of smartphones, with +15% growth in 2009 and +25% expected in 2010. Handset chip content is actually increasing versus recent years. Other gadgets expected to do well include tablets, eBook readers, and MIDs. Chip makers that will benefit from this advanced wireless handheld trend are Qualcomm, Infineon, Broadcom, Marvell, Nvidia, Intel, and STMicro, according to Craig.
Distributed computing is another trend to watch. As broadband becomes faster and increasingly available, PCs do not need to have large amounts of on board storage, expensive operating systems and processors. Instead, they can be ‘dumb terminals‘ that just interface to Internet or cloud computing based servers.
Success is created when products are specifically designed to address a particular market niche. For example, Bluetooth is really 20 different sub-markets, depending on application and usage models. Broadcom has done particularly well in these sub-markets by offering a wide variety of components and combo chips that address the different requirements of each.
To read the rest of this article go to: http://viodi.com/2010/02/19/1718/
Femtocells & Relays in Advanced Wireless Networks
Mobile Packet Core + BWA India talks and panel discussions report, IEEE ComSoc SCV chapter Jan 13, 2010 meeting
Introduction:
The IEEE ComSoc SCV chapter meeting held on January 13, 2010 offered two hot topics of interest to attendees: the Mobile Packet Core (MPC) for 3G and 4G wireless networks and the outlook for Broadband Wireless Access (BWA) in India. There were two presentations followed by a panel session which covered both the mobile packet core and BWA in India. This IEEE ComSoc SCV chapter meeting was chaired by Sameer Herlekar, ComSoc SCV Technical Activities Director. The meeting was well-attended, with the 72 people in attendance receiving the speakers’ presentations and panelists’ views with enthusiasm and often engaging the panelists in lively debate on the issues involving MPC and BWA India.
Jay Iyer, Distinguished Engineer at Cisco Systems and Eric Andrews, VP of Product Management for WiChorus/Tellabs Company (who replaced Rehan Jalil, Senior Vice President, Mobile Internet, Tellabs) were the speakers who presented their respective companies’ perspectives on MPC and later participated in the panel discussion on the same subject. They were joined on the panel discussion by Arpit Joshipura, Vice-President, Strategy & Market Development, at Ericsson Silicon Valley who articulated Ericsson’s views on MPC as well as provided his views on BWA in India.
Presentation Highlights
Presenting the first talk of the evening titled “The Mobile Internet Edge”, Jay Iyer elaborated on how intelligent networking built on an all-IP network foundation can help mobile broadband service providers monetize their investments while, at the same time, deliver a high quality of mobile experience to their subscribers. The user (subscribers’) experience will be marked by a rich suite of services including mobile video, enhanced voice and messaging, cloud-based services, other personalized services and featuring seamless interoperability over an array of devices. The talk also addressed how Cisco’s all-IP next generation network (NGN) architecture, named IP NGN 2.0 enables the ubiquity of these services. Noting that the mobile internet data traffic is projected to grow sixty-six fold by 2013-14, the presentation asserted that 64% of the mobile internet traffic by 2013 will be composed of mobile video, with speed of services and quality of service the key to user satisfaction and operator revenue generation. Additionally, the talk described the business models for new markets and services including collaboration cloud services and machine to machine (M2M) application services.
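As a quick sanity check on the cited growth figure (our own arithmetic, assuming a 2008 baseline and a 2013 horizon, which the talk did not specify), a sixty-six-fold traffic increase over five years corresponds to roughly 131% compound annual growth:

```python
# Back-of-the-envelope check of the "66x by 2013" projection.
# The five-year window is our assumption, not Cisco's stated baseline.
growth_factor = 66
years = 5
cagr = growth_factor ** (1 / years) - 1  # compound annual growth rate
print(f"~{cagr:.0%} per year")  # ~131% per year
```

In other words, traffic would need to more than double every year over the period for the projection to hold.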
The second presentation of the evening, by Eric Andrews, was titled “Mobile Packet Core Trends” and focused on smart 4G packet cores for the mobile Internet. The talk observed that mobile operators the world over are facing a severe shortfall of revenue following their introduction of mobile data services such as HSPA, in spite of actual mobile data traffic seeing an exponential increase over the same period. However, the talk also noted that the 103 million HSPA subscribers (as of Q4 2008) represented only 2% of the total mobile service subscribers. In order to meet this explosive mobile Internet data traffic growth, the mobile packet cores form a crucial network component within the new flat all-IP network architecture. Furthermore, the talk explained why smart 4G packet cores are well-positioned to not only handle the exponential volumes of data traffic (thereby enhancing the mobile subscribers’ quality of experience), but also enable network optimization and content monetization for the service operators’ benefit. Content and application awareness at the network, geographic and user levels, and a QoS guarantee enforced on identified content define the smart quality of these 4G packet cores. Additionally, the presentation elaborated on how a distributed core network architecture with internet offload at the network edge results in significant OPEX and CAPEX savings to the service operators while causing minimal operational impact on the network itself.
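The content-awareness idea can be illustrated with a toy classifier. Real packet cores use deep packet inspection and far richer policy; every name, port and mapping below is our own illustrative assumption, not the WiChorus/Tellabs design.

```python
# Illustrative sketch: a "smart" packet core classifies flows by
# content/application type and maps each to a QoS class, so that
# identified content can receive an enforced guarantee.
QOS_CLASS = {  # assumed mapping, for illustration only
    "voip":  ("conversational", 1),
    "video": ("streaming", 2),
    "web":   ("interactive", 3),
    "bulk":  ("background", 4),
}

def classify(port, payload_hint):
    """Toy content classifier; real cores inspect far deeper (DPI)."""
    if port == 5060 or payload_hint == "sip":
        return "voip"
    if payload_hint in ("rtsp", "mpeg4"):
        return "video"
    if port in (80, 443):
        return "web"
    return "bulk"

app = classify(554, "rtsp")
print(app, QOS_CLASS[app])  # video ('streaming', 2)
```

The point of the sketch is the two-step structure, identify the content, then apply a per-class guarantee, which is what distinguishes a “smart” packet core from a plain IP forwarding path.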
Panel Discussion
The three panelists concurred that the mobile Internet itself is the killer application for wireless communications, and that serious efforts need to be made towards extending the internet to help meet growing user demand for the mobile internet. Furthermore, the panelists asserted that the mobile packet core, in conjunction with either WiMax or LTE as the radio access technology is an ideal means to handle the exponentially-large volumes of data traffic expected from the mobile internet users. Eric Andrews noted that mobile users are demanding ever-increasing bandwidth for their mobile device applications and that the available bandwidth is never enough. This has been particularly true for the current state of the mobile internet, where the amount of bandwidth available to users has been far below the user’s expectations. A controversy erupted between a few of the audience members and the panelists on the question of whether off-portal services were threatening the traditional mobile operator business model and the walled-garden services. The audience members expressed their view that mobile operators have been restricting subscribers from accessing the services which they (subscribers) truly want while the panelists responded by stating their own case from the business angle.
The subject of BWA in India was then introduced by the panel moderator, Sameer Herlekar, who proffered his views on the topic, and was followed by each of the panelists who provided their own perspectives. Sameer Herlekar observed that the Indian government has initiated the BWA project in an effort to connect people in the rural areas of India to the Internet by leveraging wireless communication via WiMax as the transmission medium. This approach was mainly due to the inadequate reach of copper phone lines to the underdeveloped and remote parts of India. The three panelists observed that India is the single-largest WiMax market in the world and therefore offers mobile operators an invaluable business opportunity. However, according to Arpit Joshipura, the BWA in India will have to be deployed in the so-called metro (urban) areas first, and consolidation will need to take place among the operators before the rural markets can be tapped profitably. The panel and moderator also noted that bureaucratic hurdles are a serious impediment to the deployment of BWA in India, including the spectrum auction date which is still undecided despite a delay of over one year.
Analysis:
The opening of the airwaves to the Internet by mobile broadband is expected to spawn a host of new applications, user interfaces, services and technologies geared for mobile Internet subscribers. These applications and technologies promise mobile subscribers a whole new experience of the Internet by truly extending the Internet’s reach well beyond their desktops or laptops to their very person. Meanwhile, the mobile packet core (MPC) pledges to not only help the mobile operators monetize the tidal wave of mobile Internet data traffic expected to hit the backhaul networks, but also enable a brand-new user experience built on highly-personalized and innovative applications running on mobile devices. Companies like Cisco via their acquisition of Starent and Tellabs through their purchase of WiChorus are preparing for the aforementioned data traffic tidal wave, which has already begun. At the moment, MPC promises a win-win situation for both operators (from a business angle) and for subscribers for their Internet experience. If successful, the mobile Internet enabled by MPC and 4G radio architectures like LTE and WiMax could ultimately change the quality of our lives by providing true Internet access on-the-go. However, only time will tell if these prophecies eventually turn out to be as true as the operators are betting on, and as the mobile subscribers are looking forward to.
That the BWA market in India is potentially the single-largest WiMax market in the world is not in question. Please see this article by Alan J Weissberger for corroboration: Study Predicts India to be Largest WiMAX market in Asia Pacific by 2013
Additionally, the Indian government’s belief that the far-flung areas of India can be connected to the Internet wirelessly is reasonable, if one is to consider the astounding success of the cellular phone services in India, where 442 million subscribers currently enjoy the services and, on average, 10 million new subscribers have been added per month over the last 2 years. However, the promise of monetizing the Indian BWA market will depend on when the BWA spectrum is auctioned and allocated to operators, which, as of the time of the writing of this article, was still not finalized. Furthermore, as Alan Weissberger has noted, the BWA deployment in India will feature WiMax for fixed broadband wireless access rather than true mobile WiMax, as Indian mobile operators plan to use 3G services for mobile broadband access. This result is rather surprising, as one would expect the WiMax supporters to have left no stone unturned to leverage the vast Indian mobile broadband access market potential and showcase mobile WiMax’s capabilities versus LTE.