Everyone agrees that Generative AI has great promise and potential. Martin Casado of Andreessen Horowitz recently wrote in the Wall Street Journal that the technology has "finally become transformative":
"Generative AI can bring real economic benefits to large industries with established and expensive workloads. Large language models could save costs by performing tasks such as summarizing discovery documents without replacing attorneys, to take one example. And there are plenty of similar jobs spread across fields like medicine, computer programming, design and entertainment… This all means opportunity for the new class of generative AI startups to evolve along with users, while incumbents focus on applying the technology to their existing cash-cow business lines."
A new wave of investment driven by generative AI is building among cloud service providers, raising questions about whether Big Tech's spending cutbacks and layoffs will prove short lived. Pressed to say when they would see a revenue lift from AI, the big U.S. cloud companies (Microsoft, Alphabet/Google, Meta and Amazon) all pointed to existing services that rely heavily on investments made in the past. These range from AWS's machine learning services for cloud customers to the AI-enhanced tools that Google and Meta offer to their advertising customers.
Microsoft offered only a cautious prediction of when AI would result in higher revenue. Amy Hood, chief financial officer, told investors during an earnings call last week that the revenue impact would be "gradual," as the features are launched and start to catch on with customers. That caution fell short of the high expectations set ahead of the company's earnings, wiping 7% off its stock price (ticker: MSFT) over the following week.
When it comes to the newer generative AI wave, predictions were few and far between. Amazon CEO Andy Jassy said on Thursday that the technology was in its "very early stages" and that the industry was only "a few steps into a marathon." While many customers of Amazon's cloud arm, AWS, see the technology as transformative, Jassy noted that "most companies are still figuring out how they want to approach it, they are figuring out how to train models." He insisted that every part of Amazon's business was working on generative AI initiatives and that the technology was "going to be at the heart of what we do."
There are a number of large language models that power generative AI, and many of the AI companies that make them have forged partnerships with big cloud service providers. As business technology leaders make their picks among them, they are weighing the risks and benefits of using one cloud provider’s AI ecosystem. They say it is an important decision that could have long-term consequences, including how much they spend and whether they are willing to sink deeper into one cloud provider’s set of software, tools, and services.
To date, large language model makers like OpenAI, Anthropic, and Cohere have led the charge in developing proprietary models that companies are using to boost efficiency in areas like accounting and writing code, or adding to their own products through tools like custom chatbots. Partnerships between model makers and major cloud companies include OpenAI with Microsoft Azure, Anthropic and Cohere with Google Cloud, and the machine-learning startup Hugging Face with Amazon Web Services. Databricks, a data storage and management company, agreed to buy the generative AI startup MosaicML in June.
If a company chooses a single AI ecosystem, it risks "vendor lock-in" within that provider's platform and set of services, said Ram Chakravarti, chief technology officer of Houston-based BMC Software. This is a recurring paradigm, in which a business's IT systems, software and data all sit within one digital platform, and it could become more pronounced as companies look for help in using generative AI. The problem with vendor lock-in, companies say, especially among cloud providers, is that they have difficulty moving their data to other platforms, lose negotiating power with other vendors, and must rely on a single provider to keep its services online and secure.
Cloud providers, partly in response to complaints of lock-in, now offer tools to help customers move data between their own and competitors' platforms. Businesses have increasingly signed up with more than one cloud provider to reduce their reliance on any single vendor. Companies could end up taking the same strategy with generative AI: by using a "multiple generative AI approach," they can avoid becoming too entrenched in any one platform. To be sure, many chief information officers say they willingly accept such risks for the convenience, and potentially lower cost, of working with a single technology vendor or cloud provider.
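In practice, a "multiple generative AI approach" usually means putting a thin abstraction between application code and any one vendor's SDK. The sketch below illustrates the idea; every class and provider name in it is hypothetical, and the vendor calls are stubbed rather than real SDK invocations.

```python
# Minimal sketch of a provider-agnostic LLM layer; all class and
# provider names here are hypothetical, for illustration only.
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Common interface so application code never calls a vendor SDK directly."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class AzureOpenAIProvider(LLMProvider):
    def complete(self, prompt: str) -> str:
        # A real implementation would call the vendor's SDK here.
        return f"[azure-openai] response to: {prompt}"


class GoogleCloudProvider(LLMProvider):
    def complete(self, prompt: str) -> str:
        return f"[google-cloud] response to: {prompt}"


def answer(provider: LLMProvider, prompt: str) -> str:
    # Swapping vendors is a one-line change at the call site, which is
    # what keeps the application from becoming entrenched in one platform.
    return provider.complete(prompt)
```

The trade-off CIOs describe is visible even in this toy: the abstraction buys portability at the cost of maintaining one more layer, which is exactly the convenience a single-vendor shop gives up.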
A significant challenge in incorporating generative AI is that the technology is changing so quickly, analysts have said, forcing CIOs to not only keep up with the pace of innovation, but also sift through potential data privacy and cybersecurity risks.
A company using its cloud provider’s premade tools and services, plus guardrails for protecting company data and reducing inaccurate outputs, can more quickly implement generative AI off-the-shelf, said Adnan Masood, chief AI architect at digital technology and IT services firm UST. “It has privacy, it has security, it has all the compliance elements in there. At that point, people don’t really have to worry so much about the logistics of things, but rather are focused on utilizing the model.”
For other companies, it is a conservative approach to use generative AI with a large cloud platform they already trust to hold sensitive company data, said Jon Turow, a partner at Madrona Venture Group. “It’s a very natural start to a conversation to say, ‘Hey, would you also like to apply AI inside my four walls?’”
"Right now, the evidence is a little bit scarce about what the effect on revenue will be across the tech industry," said James Tierney of AllianceBernstein.
Brent Thill, an analyst at Jefferies, summed up the mood among investors: “The hype is here, the revenue is not. Behind the scenes, the whole industry is scrambling to figure out the business model [for generative AI]: how are we going to price it? How are we going to sell it?”
Facebook Connectivity works with partners to develop these technologies and bring them to people across the world. Since 2013, Facebook Connectivity has accelerated access to a faster internet for more than 300M people around the world. Earlier this week, during an event called Inside the Lab, our engineers shared the latest developments on some of our connectivity technologies, which aim to improve internet capacity across the world by sea, land and air:
- Subsea cables connect continents and are the backbone of the global internet. Our first-ever transatlantic subsea cable system will connect Europe to the U.S. This new cable provides 200X more internet capacity than the transatlantic cables of the 2000s. This investment builds on other recent subsea expansions, including 2Africa PEARLS which will be the longest subsea cable system in the world connecting Africa, Europe and Asia.
- To slash the time and cost required to roll out fiber-optic internet to communities, Facebook developed a robot called Bombyx that moves along power lines, wrapping them with fiber cable. Since we first unveiled Bombyx, it has become lighter, faster and more agile, and we believe it could have a radical effect on the economics of fiber deployment around the world.
- Facebook also developed Terragraph, a wireless technology that delivers internet at fiber speed over the air. This technology has already brought high-speed internet to more than 6,500 homes in Anchorage, Alaska, and deployment has also started in Perth, Australia, one of the most isolated capital cities in the world.
Bombyx wraps fiber around existing telephone wires, clearing obstacles and flipping as it needs to along its route. (Source: Facebook)
Facebook wants to bring high-speed reliable internet to more than 300M people — but the work doesn’t stop there. Connecting the next billion will require many different approaches. And as people look for more immersive experiences in new virtual spaces like the metaverse, we need to increase access to a more reliable and affordable internet for everyone. The company believes this work is fundamental for creating greater equity where everyone can benefit from the economic, education and social benefits of a digitally connected world.
“High speed, reliable Internet access that connects us to people around the world is something that’s lacking for billions of people around the world,” Mike Schroepfer, Facebook’s chief technology officer, declared during the company’s “Inside the Lab” roundtable discussion. “Business as usual will not solve it. We need radical breakthroughs to provide radical improvements – 10x faster speeds, 10x lower costs.”
Facebook and its partners are in the process of building 150,000 kilometers of subsea cables, and are working on new sea-based stations that will supply those cables with power.
“This will have a major impact on underserved regions of the world, notably in Africa, where our work is set to triple the amount of Internet bandwidth reaching the continent,” Dan Rabinovitsj, Facebook’s VP of connectivity, explained. That activity partly ties into a new segment of subsea cables called 2Africa PEARLS that will connect three continents: Africa, Europe and Asia.
2Africa PEARLS, a new segment of subsea cable that connects Africa, Europe and Asia, will bring the total length of the 2Africa cable system to more than 45,000 kilometers, making it the longest subsea cable system ever deployed, the company said.
Cynthia Perret, Facebook’s infrastructure program manager, noted that every transatlantic cable Facebook connects will contain 24 fiber pairs. “Capacity alone isn’t enough,” she said, noting that Facebook is also working on ways to configure and adapt the amount of capacity provided to each landing point. Facebook is also using a model called “Atlantis” to help forecast and optimize where subsea cable routes need to be built. An integrated adaptive bandwidth system will likewise allow Facebook to shift capacity based on traffic patterns, reducing congestion and improving reliability, Perret explained.
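The core idea behind an adaptive bandwidth system, shifting a cable's fixed total capacity toward the landing points that currently need it, can be sketched in a few lines. This is an illustrative toy, not Facebook's actual "Atlantis" model, and all site names and capacity figures are hypothetical.

```python
# Illustrative sketch of traffic-driven capacity allocation: split a
# cable's total capacity among landing points in proportion to measured
# demand, while guaranteeing each site a minimum share. All figures are
# hypothetical; this is not Facebook's actual system.

def allocate_capacity(total_tbps, traffic, min_share=0.05):
    """Divide total_tbps across sites proportionally to traffic,
    giving every site at least min_share of the total."""
    floor = total_tbps * min_share            # guaranteed per-site minimum
    remaining = total_tbps - floor * len(traffic)
    demand = sum(traffic.values())
    return {site: floor + remaining * t / demand
            for site, t in traffic.items()}

# Hypothetical measured demand (in Tbps) at three landing points.
traffic = {"site_a": 30.0, "site_b": 10.0, "site_c": 60.0}
alloc = allocate_capacity(200.0, traffic)
```

As traffic patterns shift, rerunning the allocation moves capacity toward congested landing points without changing the physical plant, which is the congestion-and-reliability benefit Perret describes.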
Facebook Inc. and Africa’s largest fiber optics company, Liquid Intelligent Technologies, are extending their reach on the continent by laying 2,000 kilometers (1,243 miles) of fiber in the Democratic Republic of Congo (DRC). The two companies intend to build an extensive long-haul and metro fiber network as part of Facebook’s effort to “connect the unconnected,” especially in developing countries.
The move will make Facebook one of the biggest investors in fiber networks in the region. The cable will eventually extend the reach of 2Africa, a major sub-sea line that’s also been co-developed by Facebook, the two companies said in a July 5th statement.
Facebook will invest in the fiber build and support network planning. Liquid Technologies will own, build and operate the fiber network, and provide wholesale services to mobile network operators and internet service providers. The network will help create a digital corridor from the Atlantic Ocean through the Congo Rainforest, the second largest rainforest after the Amazon, to East Africa, and on to the Indian Ocean. Liquid Technologies has been working on the digital corridor for more than two years, and it now reaches central DRC. The corridor will connect DRC to neighboring countries including Angola, Congo Brazzaville, Rwanda, Tanzania, Uganda, and Zambia.
The new build will stretch from Central DRC to the Eastern border with Rwanda and extend the reach of 2Africa, a major undersea cable that will land along both the East and West African coasts, and better connect Africa to the Middle East and Europe. Additionally, Liquid will employ more than 5,000 people from local communities to build the fiber network.
“This is one of the most difficult fiber builds ever undertaken, crossing more than 2,000 kilometers of some of the most challenging terrain in the world,” said Nic Rudnick, Group CEO of Liquid Intelligent Technologies. “Liquid Technologies and Facebook have a common mission to provide affordable infrastructure to bridge connectivity gaps, and we believe our work together will have a tremendous impact on internet accessibility across the region.”
Liquid Intelligent Technologies is present in more than 20 countries in Africa, with a vision of a digitally connected future that leaves no African behind.
“This fiber build with Liquid Technologies is one of the most exciting projects we have worked on,” said Ibrahima Ba, Director of Network Investments, Emerging Markets at Facebook. “We know that deploying fibre in this region is not easy, but it is a crucial part of extending broadband access to under-connected areas. We look forward to seeing how our fibre build will help increase the availability and improve the affordability of high-quality internet in DRC.”
Facebook has been striving to improve connectivity in Africa to take advantage of a young population and the increasing availability and affordability of smartphones. The social media giant switched to a predominantly fiber strategy after the failed launch of a satellite meant to beam internet signals across the continent in 2016.
About Liquid Intelligent Technologies:
Liquid Intelligent Technologies is a pan-African technology group present in more than 20 countries, mainly in Sub-Saharan Africa. Liquid has firmly established itself as the leading provider of pan-African digital infrastructure with an extensive network covering over 100,000 km. Liquid Intelligent Technologies is redefining network, cloud, and cybersecurity offerings through strategic partnerships with leading global players, innovative business applications, smart cloud services and world-class security on the African continent. Liquid Intelligent Technologies is now a comprehensive, one-stop technology group that provides customized digital solutions to public and private sector companies across the continent under several business units including Liquid Networks, Liquid Cloud and CyberSecurity and Africa Data Centers.
Following last month’s FCC filing to test a small 5G network, Facebook has filed another FCC Special Temporary Authority (STA) petition to test a “converged wireless system” that could potentially support concurrent communications across Wi-Fi and cellular networks in Menlo Park, CA (Facebook corporate headquarters).
In its FCC filing (granted June 23, 2021), Facebook said: “The experiment involves short-term testing of an LTE over-the-air setup for an indoor demonstration that is not likely to last more than six months, making an STA more appropriate than a conventional experimental license.”
The company also said it is researching a “proof of concept for a converged wireless system that will operate at the 2.4GHz Wi-Fi band and at Band 3 (1710MHz to 2495 MHz). The goal of the proof of concept is to create a demonstration and see if such a system may be viable. The system that will be tested will have a simple radio head that will be able to operate as a Wi-Fi Radio at 2.4 GHz and as a Band 3 cellular radio (LTE) concurrently. We will wirelessly connect dedicated client devices to demonstrate performance.”
The FCC approved Facebook’s request on June 23, 2021. The authority will remain in effect until its scheduled expiration date of November 10, 2021. Facebook’s petition was filed under the “FCL Tech” name, which the company has used for previous wireless tests in the 6 GHz band.
Facebook will be using five units of unspecified AVX wireless network gear (model E 102289). AVX is a Kyocera Group company. Its website states:
AVX Corporation is a leading international manufacturer and supplier of advanced electronic components and interconnect, sensor, control and antenna solutions with 33 manufacturing facilities in 16 countries around the world.
We offer a broad range of devices including capacitors, resistors, filters, couplers, sensors, controls, circuit protection devices, connectors and antennas. AVX components can be found in many electronic devices and systems worldwide.
Since WiFi at 2.4 GHz is in unlicensed spectrum (and being used indoors), one would assume that Facebook would also like to operate LTE in unlicensed spectrum in their converged network.
LTE in unlicensed spectrum (LTE-Unlicensed, LTE-U) is an extension of the 4G LTE wireless standard, originally proposed by Qualcomm, intended to let cellular network operators offload some of their data traffic onto unlicensed spectrum, such as the 5 GHz band used by IEEE 802.11a and 802.11ac compliant Wi-Fi equipment. It would serve as an alternative to carrier-owned Wi-Fi hotspots. Currently, there are several variants of LTE operation in the unlicensed band, namely LTE-U, License Assisted Access (LAA), and MulteFire.
License Assisted Access (LAA) is a feature of LTE that leverages the unlicensed 5 GHz band in combination with licensed spectrum to increase performance. It uses carrier aggregation in the downlink to combine LTE in unlicensed 5 GHz band with LTE in the licensed band to provide better data rates and a better user experience.
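A back-of-the-envelope sketch shows why LAA's carrier aggregation improves data rates: the peak throughput is roughly the licensed anchor carrier's rate plus the unlicensed carriers' rates, discounted because LAA must share the 5 GHz band with Wi-Fi via listen-before-talk. The per-carrier rates and duty cycle below are illustrative assumptions, not figures from any filing.

```python
# Rough sketch of LAA downlink carrier aggregation arithmetic.
# All per-carrier rates and the duty-cycle figure are illustrative.

def aggregated_rate_mbps(anchor_mbps, scell_rates_mbps, unlicensed_duty_cycle=0.8):
    """Peak rate ~= licensed anchor + unlicensed secondary carriers,
    with the unlicensed carriers discounted by an assumed duty cycle
    (LAA shares the 5 GHz band with Wi-Fi via listen-before-talk)."""
    return anchor_mbps + unlicensed_duty_cycle * sum(scell_rates_mbps)

# One 20 MHz licensed anchor (~150 Mbps) plus two 20 MHz unlicensed carriers.
peak = aggregated_rate_mbps(150.0, [150.0, 150.0])  # -> 390.0 Mbps
```

The same arithmetic explains why LAA keeps the anchor in licensed spectrum: the anchor's contribution is not subject to the coexistence discount, so control traffic stays on a carrier with predictable capacity.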
However, Facebook’s STA is only for the band between 1710-2495 MHz – not the 5 GHz band.
The FCC today approved Facebook’s application to test a 5G small cell network across a wide range of mid-band spectrum bands (see below) at its Menlo Park, California headquarters.
The experiment involves short-term testing of a 5G over-the-air setup for an outdoor demonstration that is not likely to last more than six months, making an STA (Special Temporary Authority) more appropriate than a conventional experimental license.
The purpose of the operation is to demonstrate the self-organizing network (“SON”) features in a 5G over-the-air setup operating in a small cell configuration. These features cannot be fully exercised in lab testing. The outdoor test setup aims to validate the improvements made to 5G cellular networks.
The improvements involve:
(1) Load balancing between the cells in an attempt to optimize the resource utilization, reduce call drops, and create a better user experience by means of improved quality of service; and
(2) Run-time selection and updates of the 5G cell physical layer cell identifiers (“PCIs”) to avoid conflicts between neighboring cells, thereby avoiding UE drops and reducing network signaling traffic.
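The second SON feature above is essentially a graph coloring problem: assign each cell a PCI that differs from all of its radio neighbors. A minimal greedy sketch follows; the topology and the greedy strategy are illustrative assumptions, not details from Facebook's filing.

```python
# Illustrative sketch of conflict-free PCI assignment as greedy graph
# coloring over the neighbor relation. The topology is hypothetical.

def assign_pcis(neighbors, num_pcis=504):  # LTE defines 504 PCIs, 5G NR 1008
    pcis = {}
    for cell in sorted(neighbors):
        taken = {pcis[n] for n in neighbors[cell] if n in pcis}
        # Pick the lowest PCI not already used by an assigned neighbor.
        pcis[cell] = next(p for p in range(num_pcis) if p not in taken)
    return pcis

# Four small cells; cell_a neighbors everyone, so its PCI must differ from all.
topology = {
    "cell_a": {"cell_b", "cell_c", "cell_d"},
    "cell_b": {"cell_a", "cell_c"},
    "cell_c": {"cell_a", "cell_b"},
    "cell_d": {"cell_a"},
}
pcis = assign_pcis(topology)
```

A real SON implementation runs this kind of assignment continuously as cells are added or neighbor relations change, which is why the filing emphasizes run-time selection and updates rather than a one-off plan.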
The frequency bands to be used are: 2.496-2.690 GHz, 3.3-3.6 GHz, 3.7-3.8 GHz, and 4.8-4.9 GHz. A directional antenna will be used to beam the 5G signals.
Facebook did not name the network equipment suppliers for this test nor did they state why they needed to perform these tests. The only hint given was to test “self-organizing network (“SON”) features in a 5G over-the-air setup operating in a small cell configuration.”
One could speculate that Facebook might want to deploy a private 5G network across its sprawling Menlo Park campus. Or they might want to provide 5G access to municipalities using mid-band spectrum.
The company does have some recent experience designing and deploying millimeter wave wireless distribution networks (based on Terragraph) which could be combined with a 5G access network.
- Facebook’s Terragraph wireless backhaul technology is being used by Cambium Networks in their 60 GHz cnWave solution. Terragraph is a high-bandwidth, low-cost wireless solution to connect cities. Rapidly deployed on street poles or rooftops to create a mmWave wireless distribution network, Terragraph is capable of delivering fiber-like connectivity at a lower cost than fiber, making it ideally suited for applications such as fixed wireless access and Wi-Fi backhaul.
- In June 2018, Magyar Telekom, a subsidiary of Deutsche Telekom, deployed its first Terragraph network in Mikebuda, Hungary. Terragraph improved local network speeds from 5 Mbps to 650 Mbps.
- Common Networks, a California-based Internet Service Provider, deployed a Terragraph network to serve customers in Alameda, CA. Local businesses and customers of Common Networks saw an immediate improvement in internet speeds. Common Networks presented their approach at a 2018 IEEE ComSoc SCV technical meeting in Santa Clara, CA.