Beans.ai and Here use firefighter and location data for mapping


Fueled by quick-commerce services and heightened delivery expectations, tech companies are building unique data sets and AI-based software to make the final steps toward a destination speedier and more efficient.

To secure the best parking, Salinas firefighters would have to flip through a binder full of printouts en route to an emergency.

When Thomas Melia, an engineer for the City of Salinas Fire Department in California, was dispatched earlier this year on a medical call to an assisted-living facility, his routing app told him where to park the truck. But when he pulled into the sprawling multi-unit complex, the ambulance service that had already arrived was parked near the front door, far from the apartment in need.

“It’s just a super long walk,” Melia said. His team had time to reposition the ambulance near the back door. “We looked and knew this unit is going to be right by the elevator by the back door.”

In the past, in order to make sure they parked in a spot where they could easily access a particular apartment and nearby standpipes, hydrants and sprinklers, the firefighters would have to flip through a “target hazard” binder full of printouts while frantically adjusting masks and toolbelts en route to an emergency.

Now, their fireproof mobile devices tell them not only the best route to take there, but the best spot to park the truck. That means Melia’s captain doesn’t have to flip through a binder as often. “It increases safety by having more eyes on the road,” said Melia. “You can start to come up with a plan before you even get there.”

They’re not using Google Maps, though.

Instead, the firefighters rely on routing technology from Beans.ai, one of many companies building unique data sets and AI-based software that make the final steps to a destination speedier and more efficient. After the pandemic fueled use of quick-commerce services and heightened delivery expectations in general, the routing data and applications devised by companies including Beans.ai, Here and even pizza purveyor Domino’s aren’t just about the last mile. They’re about the last few feet.

Founded over three years ago, Beans.ai was not originally intended strictly as a service for emergency responders. However, it turned out fire departments and government agencies were useful places to look when foraging for data that would help food and package delivery workers find the closest parking spot to a destination.

Not only did the company temporarily borrow the Salinas Fire Department’s binders of maps to input data into its system, but Beans.ai even used freedom-of-information requests — it petitioned 180 municipal agencies such as building inspection and occupational safety departments in all 50 states and 118 cities — in the hopes of retrieving information such as physical maps that show building layouts. In the end, only 31 of those agencies had something to share that the company could use to feed what it calls “ground-ops” data, said Nitin Gupta, its co-founder and CEO.

In fact, most of its data does not come from old-school analog sources. Instead, more than 90% of it has been supplied by people the company pays to submit photos of maps posted inside large apartment complexes, mobile home parks and assisted-living facilities through its 100ft Surveyor app and driver-routing app. To date, Beans said it has mapped about 70% of U.S. apartments that have 40 or more units.

Binders used by the Salinas Fire Department. Photo: Salinas Fire Department

Today, the company uses that information to refine machine-learning models for features in its apps that optimize delivery dispatch, routing and other logistics-related activities. For instance, its technology is integrated with FedEx Ground, allowing delivery contractors that pay $25 per month per driver to upload daily delivery locations in order to generate routing information for their drivers. But Beans.ai also aims to sell to government and public safety agencies through integrations with software used by emergency responders, including Tablet Command and BCS Marvlis.
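To make the last-foot idea concrete, here is a minimal sketch of how mapped entrance data can turn into a parking recommendation: pick the lot with the shortest walk to the entrance serving the target unit. The coordinates, names and data layout below are invented for illustration, not Beans.ai's actual schema.

```python
import math

# Hypothetical "last-foot" routing sketch: given entrances mapped from a
# complex's posted maps (the kind of ground-ops data described above),
# pick the parking spot with the shortest walk to the target unit.
# All coordinates and field names are invented, not Beans.ai's schema.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

parking_spots = [
    {"name": "front lot", "lat": 36.6777, "lon": -121.6555},
    {"name": "rear lot", "lat": 36.6781, "lon": -121.6542},
]
# Entrance nearest the target apartment, e.g. "back door by the elevator."
unit_entrance = (36.6782, -121.6540)

best = min(parking_spots,
           key=lambda s: haversine_m(s["lat"], s["lon"], *unit_entrance))
print("Park at the", best["name"])  # rear lot: the shorter walk
```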

Hunting for last-foot data could pay off as the rise of quick-commerce delivery services from companies such as Buyk, Gopuff and Gorillas spurs competition and development in the delivery-routing tech arena. More-established companies including DoorDash have been gathering delivery logistics data to train models used in their delivery fulfillment apps for years.

Last November, Coresight Research estimated that retail sales in the overall quick-commerce market would total $20 billion to $25 billion in the U.S. in 2021, around 10% of the research company’s estimate of U.S. online consumer packaged goods sales for the year. New players like Fridge No More, JOKR and 1520 that promise 15-minute deliveries “have intensified instant needs in terms of speed promises,” according to Coresight.

Companies in the quick-commerce sector will need accurate data and tools to get items to people quickly, or they’ll risk losing customers who can easily switch to another service, said Christoph Herzig, head of Fleet Applications for Here Technologies, which sells its routing application for delivery drivers to businesses.

“The value of location technology becomes even more important because it’s all about early user conversion,” Herzig said.

But other sorts of companies also are interested in data showing building minutiae. Herzig said Here is working with a package-delivery carrier to gather building data including details about elevators, staircases and mailboxes.

The company uses a combination of automation, machine learning and human analysis to test and refine delivery routing maps created from a variety of data sets, such as information about road surfaces and speed limits, which is especially relevant to drivers of heavy commercial vehicles who can only travel on certain roadways. Here also sells its proprietary data, and lets customers sell their own data sets or build applications using data available in its exchange.

Pizza giant Domino’s has also attempted to mine data to help streamline deliveries. The company’s more than 10,000 delivery drivers use a delivery app that its manager of data science, Zack Fragoso, told Protocol is “a key piece” of its last-mile routing efforts.

The app tracks driver locations while they are en route in order to alert customers when they are nearby. But Fragoso said that when the app was first introduced, it needed better data to fix problems such as false notifications telling customers an order had been delivered when it had not. In the hopes of achieving higher accuracy, the company added features to help gather more specific location information during deliveries, he said.
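Fragoso didn't detail the underlying logic, but a common pattern for avoiding false "delivered" notifications is to require both proximity and a minimum dwell time near the address before declaring a drop-off. The sketch below illustrates that general idea only; the radii, dwell time, function names and sample pings are all invented, not Domino's actual implementation.

```python
# Generic illustration only: require both proximity and a minimum dwell
# time near the address before marking an order "delivered." Thresholds
# and the sample GPS pings are assumptions for this sketch.

NEARBY_RADIUS_M = 400    # push a "driver is nearby" alert inside this radius
DELIVERED_RADIUS_M = 30  # must be at least this close to count as a drop-off
MIN_DWELL_S = 45         # ...and stay that close at least this long

def process_ping(state, dist_m, t_s):
    """Update delivery state from one GPS ping (meters to address, seconds)."""
    if dist_m <= NEARBY_RADIUS_M and not state["alerted"]:
        state["alerted"] = True  # send the "your order is nearby" notification
    if dist_m <= DELIVERED_RADIUS_M:
        state.setdefault("arrived_at", t_s)
        if t_s - state["arrived_at"] >= MIN_DWELL_S:
            state["delivered"] = True  # dwell satisfied: safe to notify
    else:
        state.pop("arrived_at", None)  # just driving past; reset the timer
    return state

state = {"alerted": False, "delivered": False}
for dist_m, t_s in [(900, 0), (350, 60), (20, 120), (15, 180)]:
    state = process_ping(state, dist_m, t_s)
print(state["alerted"], state["delivered"])  # True True
```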

Over the last couple of years, Domino’s delivery drivers have given the Domino's Delivery Experience app — which many drivers say their managers demand they install and use on their personal phones — mixed reviews on Reddit. Some have said they found it useful, while others complained that they didn’t like having to enable tracking functions on their phones for work, and worried about battery life and phone data usage.

“The only reason it's there is to make sure we're not off fuckin’ around during deliveries,” wrote a Reddit poster last year.

Domino’s delivery drivers have given the Domino's Delivery Experience app mixed reviews. Photo: Domino's

While helping to make emergency calls and deliveries more efficient, emerging technologies used for delivery routing and fleet management also create ethical questions related to worker privacy and civil rights.

Apps used by workers that merge personal and professional device use are “undermining workers’ basic human right to disconnect” especially “when workers are required to use personal devices that deliver data to employers, which can be used against them,” wrote Wilneida Negrón in a 2021 report for Coworker.org. The report pointed to threats to worker rights that can be exacerbated by data collection and algorithmic technologies such as increased worker monitoring, wage theft and labor-organizing surveillance.

When FedEx drivers or other delivery drivers use Beans.ai apps to assist in routing, they can turn off location tracking while still accessing static routing information, said Gupta, who added that Beans.ai does not sell its data as a separate product.

Still, as Beans.ai pushes for ways to turn its data technologies into valuable services for business customers in the delivery game, the company has added features and capabilities that some drivers might find invasive. For example, the company uses phone accelerometer data to determine whether someone is driving, walking or idle, as well as camera footage from a dashcam provider to help determine whether deliveries happened, whether drivers parked where they say they did or whether they were driving while using their phones.
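Gupta didn't describe how that classification works, and the sketch below is only a minimal stand-in for whatever Beans.ai actually does: it labels a window of accelerometer magnitude readings as idle, driving or walking based on their variance. The thresholds and sample values are invented for illustration.

```python
import statistics

# Minimal sketch of accelerometer-based activity detection, assuming the
# simplest possible approach: variance thresholds on acceleration
# magnitude (in units of g). Thresholds are invented, not Beans.ai's.

def classify_activity(magnitudes_g):
    """Guess driving / walking / idle from a window of accel magnitudes."""
    var = statistics.pvariance(magnitudes_g)
    if var < 0.0005:
        return "idle"      # phone is nearly still
    if var < 0.02:
        return "driving"   # low-amplitude vibration from the vehicle
    return "walking"       # strong periodic swings from each step

print(classify_activity([1.00, 1.00, 1.01, 1.00]))        # idle
print(classify_activity([0.98, 1.03, 1.01, 0.97, 1.02]))  # driving
print(classify_activity([0.6, 1.5, 0.7, 1.4, 0.8, 1.6]))  # walking
```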

That data gets fed into the Beans.ai system to improve its machine-learning models for routing, but is also used for driver safety reports provided to customers to keep track of driver activity. “With the camera footage we’re getting, we’re able to constantly reevaluate the data,” Gupta said, noting that the Beans.ai app only tracks drivers while they’re working. “When the driver clocks out, it automatically clocks out of any type of tracking.”

The additional driver-tracking capabilities were integrated to ensure the company’s services stay relevant to potential customers, such as delivery contractors that are required to monitor driver incidents or behavior, he said. “Our integration with our partners helps streamline the user experience,” Gupta said. “Every other data company we looked at in the space does not have a strong feedback loop on their data.”


Kate Kaye is an award-winning multimedia reporter digging deep and telling print, digital and audio stories. She covers AI and data for Protocol. Her reporting on AI and tech ethics issues has been published in OneZero, Fast Company, MIT Technology Review, CityLab, Ad Age and Digiday and heard on NPR. Kate is the creator of RedTailMedia.org and is the author of "Campaign '08: A Turning Point for Digital Media," a book about how the 2008 presidential campaigns used digital media and data.

Climate TRACE is an ambitious attempt at detecting and tracking global greenhouse gas emissions on a granular level, in real time, for the first time.

“Can you monitor everything?” Al Gore asked founding Climate TRACE member Gavin McCormick.

Michelle Ma (@himichellema) is a reporter at Protocol covering climate. Previously, she was a news editor of live journalism and special coverage for The Wall Street Journal. Prior to that, she worked as a staff writer at Wirecutter. She can be reached at mma@protocol.com.

You can’t reduce carbon emissions if you don’t know where they’re coming from. Or as Al Gore put it in a recent conversation with Protocol, “There’s an old saying in the business world, ‘You can only manage what you can measure.’” And up until now, measuring who was emitting what and when was nearly impossible.

Climate TRACE began as a student project aimed at tracking emissions using AI and satellites. Photo: Planet, with permission and modification by Climate TRACE

The coalition of researchers and tech companies is leaderless and unincorporated, and its data is completely free and public. “We thought that was essential for anyone to trust us,” founding member Gavin McCormick told Protocol.

Climate TRACE’s mission may seem like a tall order, but it’s one that a number of big names in tech believe in. John Doerr has endorsed it, and Gore is a founding member and donor along with partners at his firm, Generation Investment Management. Google.org — Google’s charitable arm — and Eric and Wendy Schmidt’s philanthropic venture Schmidt Futures are also helping to get Climate TRACE’s efforts off the ground.

What became Climate TRACE started out as a student project led by McCormick at the University of California, Berkeley, aimed at tracking emissions from power plants using AI and satellites. “We started it for fun,” McCormick said, in what is a very nerdy definition of a good time. “It was very much by accident. There was no founding vision, no deep-seated belief that what we were doing would save the world.”

But things began to get more serious when the researchers decided to apply for funding through Google.org’s AI Impact Challenge “on a lark,” according to McCormick. They ended up winning in 2018, and the snowball that would become Climate TRACE began to roll.

Gore’s office reached out shortly thereafter. McCormick said Gore told him he’d “been waiting for something like this for years,” and asked if it was possible to do more than monitor power plants. “Can you monitor everything?” the former vice president asked.

Gore, for his part, had advocated for using AI to better detect emissions sources as early as his vice presidency. “But the technology was not mature at that time,” he told Protocol.

McCormick told Gore what he wanted still wasn’t possible. After all, the team was made up of students or, in McCormick’s case, a dropout: He’d left his Ph.D. program to focus full time on running the project, now a nonprofit dubbed WattTime. (The NGO now uses the emissions data that Climate TRACE compiles to help companies calculate the least-polluting time for electricity use.)

“But we realized that if we just partnered with enough organizations … between us all, we can each bite off a piece of it,” McCormick said.

Thus, Climate TRACE was born. “TRACE” stands for Tracking Real-Time Atmospheric Carbon Emissions. (“Nailing the acronym is an important step,” Gore joked.) Today it’s a coalition of over 50 organizations, ranging from for-profit businesses to university research labs, that meets twice a week on Zoom to take on the thorny task of detecting and tracking global greenhouse gas emissions in real time. Each member organization is responsible for a different sector, ranging from shipping to oil and gas to mining.

Last fall, the coalition released the world’s first global emissions inventory, which can be broken down by individual sector and country. “That’s the first time that’s ever been done, and the work has continued and intensified,” Gore said.

This October, the coalition plans on releasing the first-ever asset-level inventory, which will show the greenhouse gas emissions from individual power plants, steel mills or cargo ships. Climate TRACE also plans on ranking the 500 biggest sources of greenhouse gas pollution in every subsector of the global economy. That will come, Gore noted, just in time for major international climate talks being held in Egypt. The coalition is also working on developing APIs that could help investors trying to decarbonize their portfolios, corporate supply chain managers trying to reduce emissions or NGOs that want to better target their campaigns against polluters.

In an era when companies and countries alike are making major commitments to reducing emissions, Climate TRACE’s data could be a revelation. There is currently no truly independent source verifying that countries and companies are actually doing what they’ve promised. In fact, accountability has been seriously lacking in almost every context; the Paris Agreement is nonbinding, and corporate climate plans are, at the end of the day, just nice promises. Absent strong, binding regulations or agreements, an independent body to publicly name and shame polluters is one of the best tools to stop bad actors from frying the planet.

Carbon dioxide is notoriously hard to measure and track via satellite imagery. Satellites can technically detect carbon dioxide in the atmosphere, but you can’t control what’s in its optical path, said Pieter Tans, senior scientist at the National Oceanic and Atmospheric Administration’s Global Monitoring Laboratory. In essence, that means that just because you see carbon dioxide near a power plant, that doesn’t mean the power plant emitted it, because of confounding factors like background pollution, wind and other types of weather.


To get around this sticky situation, McCormick said Climate TRACE relies on “anything that we can physically see” to detect pollution and its source. Among these are steam coming out of cooling towers, thermal infrared heat, column-integrated nitrogen oxide (a co-pollutant) and even ripples in lakes near power plants, which can indicate if water is being used for cooling purposes.

Climate TRACE gathers data from carbon dioxide sensors on the ground, the most reliable being the continuous emission monitoring system, which monitors every power plant in the U.S. Those sensors are high-quality, but there aren’t very many of them. Climate TRACE uses them, though, to cross-validate other sources of data that aren’t as high-quality or granular, such as satellite data. (Climate TRACE relies on publicly funded satellites like those launched and managed by NASA and the European Space Agency, as well as commercial satellite data that it purchases from places like Planet and GHGSat.) That helps train the coalition’s AI.
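As a rough sketch of that cross-validation idea, with fabricated numbers and no claim about Climate TRACE's actual pipeline, trusted ground-truth readings at a handful of plants can be used to fit a simple correction to noisier satellite-derived estimates:

```python
import numpy as np

# Sketch only: use CEMS ground-truth readings at a few plants to fit a
# linear correction for noisier satellite-derived estimates of the same
# plants' emissions. All numbers are fabricated for illustration.

cems_tons = np.array([120.0, 310.0, 95.0, 480.0, 210.0])  # ground truth
sat_tons = np.array([150.0, 280.0, 130.0, 430.0, 240.0])  # satellite estimate

slope, intercept = np.polyfit(sat_tons, cems_tons, 1)  # least-squares fit
calibrated = slope * sat_tons + intercept

def rmse(a, b):
    """Root-mean-square error between two arrays."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

print(f"RMSE raw: {rmse(sat_tons, cems_tons):.1f} t")
print(f"RMSE calibrated: {rmse(calibrated, cems_tons):.1f} t")
```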

In Gore’s view, Climate TRACE’s data will allow everyone interested in solving the climate crisis — businesses, investors, NGOs, governments — to take action that could “quickly and dramatically” reduce emissions. Up until Climate TRACE, “we did not have actionable data,” he said.

Climate TRACE gathers data from carbon dioxide sensors on the ground as well as satellites in space. Photo: Climate TRACE

It’s not a coincidence that the group plans on releasing its next inventory right before the next U.N. climate conference. “All of the present data sources on greenhouse gas emissions, other than Climate TRACE, are derived from a single bottleneck source,” Gore said. And it's self-reported emissions data handed to the U.N., which only developed countries are required to report. Those reports are often five or more years out of date and include “large omissions,” Gore pointed out.

There are “huge biases in accounting procedures,” Tans said, so any efforts to add more accuracy and specificity to country-level reporting would be helpful. (A Washington Post investigation before climate talks last year confirmed this.)

True to its word, Climate TRACE has already found some inconsistencies between reported emissions and what it's detected. One of the biggest discrepancies is, perhaps unsurprisingly, from the oil and gas sector, which McCormick said “is noticeably less honest than all the other sectors.” Emissions from production and refining were approximately double what had been reported to the U.N.

“Again, that’s only for the countries that are required to report their emissions,” Gore stressed. Climate TRACE’s estimates show that more than 1 billion additional tons of emissions have gone uncounted by countries that aren’t required to report them.

Climate TRACE has another advantage. Unlike efforts like Tans’, whose lab sits within the U.S. Department of Commerce, the coalition’s work isn’t subject to the whims of a changing administration.

“I thought this is something the government should do. However, I have doubts,” Tans said, pointing to the fact that President Donald Trump tried to drastically cut funding to NOAA while he was in office. While Trump didn’t succeed, Tans said “there’s a vulnerability there” for federal agencies’ data.

Despite his background, Gore also strongly believes that independent coalitions like Climate TRACE are better suited to take on this work of emissions tracking than the government. The evidence speaks for itself, he said: “Governments haven’t done it,” partly because it’s hard to do. AI and machine learning powering the efforts is “relatively new” and is only now being harnessed to “precisely identify the emissions in every single sector.”


The coalition’s work is far from done. “We’re recruiting every day,” McCormick said. One area where it’s still struggling, and looking for additional organizational support, is detecting indoor fossil fuel use and emissions from things like cooktops and water heaters, which are understandably hard to track via satellite. Soil carbon is another source of emissions that’s difficult to track from space and that the group hopes to suss out.

McCormick recounted a conversation he had with Gore in which the former vice president, as one of the people most responsible for waking the world up to the climate crisis, told him he thinks the conversation has swung too far “to the doom side.” Panic is now becoming a bigger risk: When people get into panic mode, they stop looking for solutions.

In McCormick’s view, Climate TRACE is one reason for optimism, because the group’s not just calling out the “bad guys”; it’s also revealing those solutions. Through the coalition’s work, for example, the shipping industry has figured out that slowing down “significantly” cuts emissions, a relatively simple fix.

“I’m by nature a pessimist, and you wouldn’t know it from my face, because I’m just staring at really promising data,” he said.


Engaging the value chain around emissions reduction will be crucial to success, says PepsiCo.

Chris Stokel-Walker is a freelance technology and culture journalist and author of "YouTubers: How YouTube Shook Up TV and Created a New Generation of Stars." His work has been published in The New York Times, The Guardian and Wired.

Tackling emissions and achieving net zero has become a key part of every company’s environmental, social and governance (ESG) goals. “The threat of climate change is very real. It is one of the most important issues we will encounter in our lifetimes,” said Jim Andrew, PepsiCo’s chief sustainability officer. “What used to be hundred-year weather events are now happening every other year. We need to limit our planet’s temperature increase to 1.5 degrees Celsius to avoid even more severe consequences.”

Jim Andrew is Executive Vice President, Chief Sustainability Officer for PepsiCo.

It's a target that companies worldwide are seeking to meet — including many, like PepsiCo, that are part of the Science Based Targets Network, a group of companies, NGOs, consultancies and coalitions aiming to road-test measurement and reporting methodologies that can help maintain a balanced planetary ecosystem.

As part of PepsiCo Positive (pep+), a strategic end-to-end business transformation with sustainability and human capital at the center, PepsiCo reset its climate change targets, doubling the size of the task ahead of it. “We plan to reduce operational Scope 1 and 2 emissions by 75% and our Scope 3 value-chain emissions by 40% by 2030,” said Andrew. “In addition, we pledged to achieve net-zero emissions by 2040, one decade earlier than called for in the Paris Agreement. It’s not an easy task, but it is important to do for the future of the planet — and one we take seriously as one of the world’s biggest food and beverage companies.” To achieve those goals, PepsiCo has built out a climate action plan that includes scaling regenerative agriculture across the land it sources ingredients from, reducing virgin material usage in packaging and shifting to renewable electricity and fuels.

Alongside the environmental and social benefits of pep+, there are also business benefits. “We are building resilience for the future,” shared Andrew. “Our consumers are watching and factoring sustainability into the choices they make. This is an opportunity for growth and to create value for our shareholders.”

But for many companies on the path to net zero, it’s not always a straightforward journey, particularly when much of a full greenhouse gas footprint can emanate from outside the four walls of a company’s own manufacturing operations. In the case of PepsiCo, 93% of emissions come from its value chain.

This means companies need to think differently and find ways to support and encourage those along the value chain to reduce their own emissions.

“In order for PepsiCo to achieve our net-zero goals, we can’t underestimate the importance of our value chain embracing and implementing science-based goals of their own,” said Andrew. “Our science-based target covers everything from the farmers we rely on at the start of our value chain to our packaging manufacturers.” PepsiCo is asking all partners along the value chain — from suppliers to manufacturers and franchise bottlers — to set their own science-based targets. “It’s the biggest challenge in our journey to net zero. We know this is not going to be an overnight change. We are putting in place the levers and supports now that will have an impact in the future. We don’t have all the answers yet, but we’re making sure that what we do know, we’re sharing to maximize our impact,” said Andrew.

Asking suppliers and associated companies to overhaul the way they work is no small feat, but PepsiCo is taking a three-pronged approach centered around the principles of educating, enabling and incentivizing. An external-facing program, the Sustainability Action Center, aims to engage and equip value chain partners with tools to undergo their own sustainability journey. “We provide a quick assessment to determine their climate maturity level, and those results will then direct them to targeted resources appropriate to their level,” said Andrew. “It’s designed to avoid information overload and to encourage step-by-step action.” A global summit also educated partners on PepsiCo’s pep+ goals and best practices, and a new Positive Agriculture playbook has been openly shared with all agricultural suppliers to provide guidance on how to implement regenerative practices on farms, which will deliver an overall reduction in greenhouse gas emissions.

But education is only half the task: To enable partners to make meaningful changes, PepsiCo announced a new initiative: pep+ RENew. “It’s a first-of-its-kind collaboration with Schneider Electric, to provide value chain partners with easy access to renewable electricity and speed their transition to renewable energy through PPAs and other options,” said Andrew.

And incentivizing companies to make a change is vital, too. “We know we’re making a big ask of those we work with, and that it’s not easy,” said Andrew. “We can use our size to send demand signals with others, and also our purchase decisions are meaningful.” In 2021, PepsiCo joined the EPA’s Green Power Partnership, ranking in the top 15 companies nationally in the last two quarters. “We are also members of the Clean Energy Demand Initiative of the State Department, which seeks to build [renewable energy] demand in developing countries, with the first being Vietnam,” said Andrew.

Recognizing that innovation comes from small startups as well as big corporations, the company has also developed PepsiCo Labs. “It’s a place where we can pilot and scale new innovations,” said Andrew. More than 2,000 startups that could positively impact all parts of the PepsiCo business have been explored, with 150 pilots across 70 countries being put into action. “We’ve taken more than 25 of those ideas and supported them, actively scaling them to become businesses,” said Andrew.

It’s all part of making sure that the company leaves the planet in a better place than it was found. “Companies have to engage their value chain to be impactful in reaching their net-zero goals,” said Andrew. “It’s not possible to do it alone, no matter how big you are. You have to bring everyone else along with you to reach those targets and to ensure that we’re helping the planet.”

Read more about PepsiCo’s progress towards its pep+ ambition here: https://www.pepsico.com/our-impact/sustainability/2021-esg-summary/


Don’t know what to do this weekend? We’ve got you covered.

Our recommendations for your weekend.

Janko Roettgers (@jank0) is a senior reporter at Protocol, reporting on the shifting power dynamics between tech, media, and entertainment, including the impact of new technologies. Previously, Janko was Variety's first-ever technology writer in San Francisco, where he covered big tech and emerging technologies. He has reported for Gigaom, Frankfurter Rundschau, Berliner Zeitung, and ORF, among others. He has written three books on consumer cord-cutting and online music and co-edited an anthology on internet subcultures. He lives with his family in Oakland.

Nick Statt is Protocol's video game reporter. Prior to joining Protocol, he was news editor at The Verge covering the gaming industry, mobile apps and antitrust out of San Francisco, in addition to managing coverage of Silicon Valley tech giants and startups. He now resides in Rochester, New York, home of the garbage plate and, completely coincidentally, the World Video Game Hall of Fame. He can be reached at nstatt@protocol.com.

This week we’re herding cats, stressing out over “The Rehearsal” and “The Bear,” and getting the backstory on #ReleaseTheSnyderCut.

Nothing can quite prepare you for “The Rehearsal,” comedian Nathan Fielder’s follow-up to his Comedy Central series “Nathan For You.” While it follows the same broad strokes of what Fielder fans have taken to calling reality comedy in the vein of Sacha Baron Cohen, “The Rehearsal” reaches unprecedented, profound heights by asking difficult questions about social anxiety and the lengths to which humans will go to avoid feeling emotional pain. It is equal parts deranged dark comedy and alarmingly cathartic reality TV, with a dash of Charlie Kaufman’s “Synecdoche, New York.”

The show, which aired its pilot last week, is not for everybody; as with Fielder’s past work, cringe-sensitive viewers may find themselves literally sick to their stomachs at his willingness to push situations to the extreme. But “The Rehearsal” is worth the discomfort for the sheer range of emotions it will yank out of you before you’ve even recognized the magic trick it’s pulling off before your eyes.

In case “The Rehearsal” wasn’t stressful enough, Christopher Storer’s drama “The Bear” will shave a year or two off your life. The dramedy about a struggling sandwich shop in Chicago newly helmed by a former professional chef can be so intense in its camera work, lightning-fast dialogue and realistic portrayal of toxic restaurant work environments that real-life chefs have admitted to not making it through a single episode because of how close to home it hits. But the eight-part series, now renewed for a second season, is so fresh, raw and well-acted that it is impossible not to recommend — it’s no wonder “The Bear” currently holds a 100% rating on Rotten Tomatoes. Just prepare your heart rate in advance.

There’s not much you really need to say about Stray, a new adventure game from French developer BlueTwelve Studio and Annapurna Interactive. The main character is a cat, and anyone who's seen the trailers should know why that makes it an instant must-play for pretty much everyone. Also, it does an excellent job of moving past its premise to tell a touching and adorable narrative about resilience and survival while also featuring some beautiful environments and clever puzzles.

Stray keeps itself interesting through its robot characters, telling the story of a ruined world while also centering its emotional journey on the “Homeward Bound”-like adventure of its titular feline. The game is available this month as part of Sony’s PlayStation Plus Extra and Premium subscriptions if you don’t feel like purchasing it outright.

British studio Supermassive Games’ latest release is a fantastic entry in the complicated and often messy market for games that are more like interactive movies. As with the studio’s past title Until Dawn and many of the Hollywood-aspiring releases from developers like Quantic Dream, The Quarry’s interactivity mostly centers on making pivotal choices about the fate of its characters.

But the freedom it gives you and its schlocky B-movie horror influences make it a perfect game for people who don’t play a lot of games, with plenty to love if you’re a fan of slasher flicks and monster movies. With its twisting narrative and strong replayability to unlock different endings and uncover more secrets, The Quarry succeeds as arguably the best version yet of this particular take on video game narrative.

Remember #ReleaseTheSnyderCut? The social media movement that led Warner Bros. to release a second “Justice League” version on HBO Max was unprecedented — and likely driven by bots tied to a now-defunct Los Angeles ad agency, according to an internal investigation conducted by the studio that Rolling Stone recently got its hands on. Some of Zack Snyder’s real fans took things even further, harassing studio executives and journalists alike. The craziest part of the story, however, is Snyder’s own role in all of this, which allegedly involved taking hard drives from the studio lot and reshooting scenes in his own backyard.

A version of this story also appeared in today’s Entertainment newsletter; subscribe here.


Microsoft’s Power Platform strategy includes low-code and no-code software development tools that allow non-technical people to build applications for their team’s needs.

Microsoft corporate Vice President Charles Lamanna told Protocol about low-code/no-code tools.

Aisha Counts (@aishacounts) is a reporter at Protocol covering enterprise software. Formerly, she was a management consultant for EY. She's based in Los Angeles and can be reached at acounts@protocol.com.

Despite the fact that enterprise tech is still trying to figure out exactly where low-code fits, Charles Lamanna is all in on low-code software development. “I think there's probably no more perfect manifestation of the Microsoft mission statement of ‘empower everyone and every organization in the world to do more’ than low code,” he said.

Lamanna, whose career at Microsoft has been marked by a swift ascension to the upper executive ranks, began his journey with the company in 2009 by helping Microsoft shift one of its most storied products — Office — to the cloud with Office 365. But an itch to build new products and services and a curiosity about the public cloud led him to co-found cost-management startup MetricsHub in 2012. Within six months, Microsoft acquired the company, and the capabilities Lamanna developed laid the foundation for what would become Azure Resource Manager and Azure Monitor, among other services.

In another moment of prescience a few years later, Lamanna and a small team of developers entered a Microsoft hackathon where they built a product called Wolf Crow, a low-code/no-code automation and integration tool that would later become Azure Logic Apps, then Microsoft Flow, then Power Apps and eventually Power Platform, a group of business software tools that he now oversees along with Microsoft’s Dynamics 365 applications.

Now, as corporate vice president of Microsoft Business Applications, Lamanna has a broad remit at the center of “one of the fastest-growing Microsoft businesses at scale.” That business is Power Platform, which according to Lamanna has more than 7 million monthly active users, more than $2 billion in revenue and is growing at an astonishing 72% year-over-year.

In a conversation with Protocol, Lamanna talked more about why Power Platform matters to Microsoft, how it plays with Dynamics 365, Teams and the broader Microsoft suite and why productivity and collaboration are key to Microsoft’s low-code future.

The following interview has been edited and condensed for clarity.

How many people are using Power Platform?

There’s a few stats we've shared to size it: There’s over 7 million monthly active citizen and professional developers, which is pretty astonishingly big if you go compare it to a programming language or something. And the majority of those are business users, non-coders. [Microsoft CEO Satya Nadella] shared, I think a quarter or two ago, that we crossed $2 billion in revenue in the trailing 12 months, which is growing 72% year-over-year, one of the fastest-growing Microsoft businesses at scale. The dollar piece … shows there's value. Because that's the other thing — lots of people use it, and they get a lot of value. 97% of the Fortune 500 uses the Power Platform, 92% of the Fortune 500 uses Power Apps in at least one department. It’s not like air, I don't see it literally everywhere, but you run into it all the time.

What factors do you see driving low-code adoption?

I consider myself a technology historian to a degree. There's a great article — I think it was by Joel Spolsky — that once talked about what spreadsheets did for data entry. Forty years ago, there were professional spreadsheet users and data-entry users because you didn't have a personal computer or you didn't have spreadsheets. There were several hundred thousand people with that job in the U.S., and then all of a sudden, you get the personal computer and the spreadsheet, and a lot of the work they used to do, people just do themselves, because I can just open Excel and put the data in.

But what's interesting is these professions evolved to higher-end, more sophisticated solutions where they got to use more mathematical creativity and statistical creativity instead of just [doing] the simpler tasks. So the analogy I would draw is what I think low-code will do today.

Today, if this is the total number of solutions that the enterprises should build, say there's 500 million of them: They can only afford to build 100 million at the current cost per solution that gets built. So 400 million apps have demand that goes unmet. Those 400 million can now be picked up and built with low-code tools by the business users themselves, and by non-coding but technically proficient people. Think of an IT admin who maybe can’t deploy to a Kubernetes cluster but definitely could go build a Power Platform solution. So they can go attack those 400 million. And the 100 million, a portion of those will also now be built with low-code to go faster and be more affordable.

I kind of see it as a broad spectrum, and one of the core theses that we have for Power Platform is that it has to be a platform that works for all three types of users: business users, so the citizen developers; IT professionals, so non-developers but technically proficient folks; and professional developers. All three have to be able to work on one platform. They don't all use the same user interface, they don't all use the same programming model, but all three have to be able to work on one platform. Otherwise, you can't really enable all 500 million apps, because you need all three to start using these tools and to use them together in concepts like Fusion Teams. Otherwise, you'll never be able to go tackle it broadly.


Our view is, we have slang for it: no-code, low-code and pro-code. All are welcome, is what we say. No-code for business users, low-code for IT pros and pro-code for the pro devs. And we really focus on making that possible. And that's a hard thing from a technology and a user experience thing, that is the big challenge. How do you make it that capable but that understandable; that powerful, but that easy to get started?

That makes sense. [Low-code/no-code] is really just a spectrum, and there are trade-offs at each stage. You get more power here, but then it's harder to use, and maybe less power here, but then it's easier to use [and] faster to get started.

The best low-code tool ever is Excel. I can open Excel and I can make a list of stuff and add stuff up with no training. Then you have people who I swear basically get Ph.D.s in Excel doing super complex [net present value] derivative modeling, unbelievable things in Excel. That's all one platform. The magic is that it can be one platform. And there's probably a whole other background as to the magic of platforms where you can reuse it for many use cases because you get amazing skill and leverage, but I think it's the same type of approach.

How does Power Platform play across the broader Microsoft business software suite, going into Dynamics 365 and Office?

We like to think that Office and Dynamics kind of rest on top of Power Platform, and what we mean by that is, if you want to do extensibility and customization in Office or Dynamics, you turn to Power Platform. So if you go to a SharePoint list, and you want to add a workflow on your SharePoint list, that's actually embedded Power Automate. Or if I go to SharePoint and want to embed an app on my SharePoint site, that’s going to be through Power Apps. Or if I'm inside of Microsoft Teams, and I wanted to build a custom workflow, that's Power Automate. Or if I want to build a dashboard and report, that’s Power BI. Or if I want to build a chat bot in Teams, that’s Power Virtual Agent. And we have really seamless and easy integration. The same thing is true [for Dynamics], Dynamics is literally built on top of Power Platform, so if you want to go change a form or a data schema or some logic or workflow in Dynamics, you just end up inside the Power Platform experience.

We have this belief that over time, every enterprise software solution will need to have extensibility through low-code [tools], and you start to see that: Every company advertises low-code/no-code customization and configuration now. And one of the things we announced at [Microsoft Build] was basically the Power Automate embed capability. So other software companies can even embed Power Platform inside their solutions, their SaaS offerings, without having to go build their own low-code platform.

Do you see productivity and collaboration tools as a key part of this low-code/no-code movement?

I think one of the most interesting opportunities is that low-code lets you build a lot of things, you get a lot of apps. We have customers with tens of thousands of apps and tens of thousands of Power Automate workflows and tens of thousands of Power BI dashboards and a thousand Power Page websites, so you get a lot of solutions.

And one of the biggest challenges is: How do you make it so that your users can discover them? Having communication, collaboration and low-code working well together makes it so you can actually embed a Power App in Microsoft Teams. You can, for example, pin a Power App in a Teams channel, or in the personal tab on the left-hand side, and a lot of our users and customers do that because, that way, it doesn't feel like I'm going to a different app or a different website: It actually feels like it's part of Teams. That type of discovery — and all that discovery is open so anyone can go use it at any company, it's not just a Microsoft-only extensibility model — but that type of integration makes it so easy for the solutions which are built so easily to also be used so easily. So I think that's a major component.

And then one of the biggest things that we do is, we don't make you learn: If you already know Office, you already know Power Platform — that's kind of our overarching thing. We work with the Office team and the Power Platform team to share design patterns, to share [user interface] components, to share user experience or interaction models, things like the formula language — it really is an extension of the Excel formula language. We do those types of things because we already trained a billion people on Office; let's have their skills feel familiar when they end up in the Power Platform.


Andy Jassy’s got 99 antitrust problems, but One Medical probably won’t be one.

If the agency does dig into the One Medical deal at all, there’s a good chance it would start with consumer data concerns.

Kate Cox ( @KCoxDC) edits Protocol's policy coverage. Before joining Protocol, she covered tech policy and culture as a reporter for Ars Technica. Prior to Ars, Kate reported on mergers and antitrust matters at CQ Roll Call; tech policy, privacy and other consumer issues at Consumerist; and video games basically anywhere that would allow it. She is originally from Boston but settled in the Washington, D.C., area in 2008, where she lives with her husband, kids and cat.

Amazon is once again in the headlines with a high-value acquisition: This time, it’s reaching a tentacle into the health care space by acquiring boutique primary care provider One Medical for roughly $3.9 billion in cash.

The planned acquisition creates many questions. Some, like “Why health care?” are fairly self-explanatory. Annual U.S. health care spending is measured in trillions of dollars, so it’s no surprise when a business wants a bigger slice of that pie. Amazon has been making direct inroads into health care since acquiring mail-order pharmacy PillPack in 2018, which have only grown as it launched products such as Amazon Pharmacy.

Other questions are slightly more complicated, such as “Is Amazon going to be allowed to do that?” The short answer is almost certainly going to be “yes.” And while that definitely feels weird to some, the long answer has far more to do with how U.S. law handles competition than it does with Amazon.

Isn’t Amazon already in antitrust trouble?

It sure is! The FTC has been probing Amazon since at least June 2019, and the investigations are reportedly still going strong.

Amazon is also in trouble with Congress. A House committee probing Big Tech issued a blockbuster report in late 2020 accusing Amazon of abusing its “monopoly power,” and saying the company should be split. (Apple, Alphabet and the company now known as Meta also got dinged in that report.) The House Judiciary Committee also recently asked the Justice Department to probe whether Amazon had been truthful during the course of the investigation leading to that report.

One of the House staffers who authored that report was Lina Khan, who is now pushing an aggressive antitrust enforcement agenda as the FTC’s chair. Although Amazon is not the only company on the agency’s radar by a long shot, Khan came to national prominence with a 2017 Yale Law Journal article arguing for stronger antitrust enforcement against Amazon. Although the FTC hasn’t filed a suit against Amazon yet, most observers expect the commission to tackle Amazon’s retail practices. The European Commission is likewise investigating whether Amazon’s behavior with third-party vendors breaks competition law in the EU.

Amazon’s AWS business, which controls at least a third of the cloud services market, is also reportedly under antitrust scrutiny, and there are lawsuits underway in both Washington state and D.C. alleging Amazon’s behavior around its third-party marketplace vendors violates antitrust law.

Amazon definitely has to file paperwork about the deal with the FTC, there’s no way around that. Companies planning a merger or acquisition above a certain threshold — $101 million for 2022 — must file their plans with the FTC before completing the transaction. That filing kicks off a 30-day waiting period that regulators can resolve one of three ways: They can grant early termination, meaning they have no issues with the transaction; they can file a second request for information, kicking off an actual probe; or they can do nothing at all, and simply let the waiting period expire.

That “second request” is basically where a formality turns into an investigation. There’s no guarantee the FTC would open one, but for a transaction that both involves Amazon and is valued at more than $1 billion, there’s definitely a non-zero chance it would.

We saw something similar last year when Amazon acquired MGM Studios. The transaction kicked off a flurry of investigation and calls for stronger antitrust enforcement — but when you get right down to it, “film studio” is not Amazon’s core business, nor is MGM anything like the largest player in the space, and so regulators worldwide more or less said, “ugh, OK, fine,” and Amazon moved forward as if the FTC had, too.

It’s worth noting, though, that even if the waiting period closes uneventfully, that doesn’t mean the FTC is done with Amazon. Even though the MGM transaction closed uneventfully in the spring, media reports from late May and early June suggest the commission is still deeply probing the transaction.

Historically, U.S. antitrust regulators are concerned with intense concentration in a specific, tightly defined market. As an example, let’s say four companies manufacture widgets, and each of those companies has 25% market share. If Widgets Inc. purchases two of its competitors and achieves 75% market share, it gains outsized power and can distort the market around it. The remaining company, holding 25% market share, will have a harder time competing and new competitors will be extremely hard to spin up.
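To make the arithmetic explicit, the sketch below works through the widget hypothetical using the Herfindahl-Hirschman Index, the standard concentration measure U.S. regulators apply to mergers. The HHI framing is an addition for illustration, not something the article itself invokes.

```python
# Worked version of the widget hypothetical above, using the
# Herfindahl-Hirschman Index (HHI): the sum of squared market shares in
# percentage points. Under the 2010 U.S. merger guidelines, scores above
# roughly 2,500 are treated as highly concentrated.

def hhi(shares_pct):
    """Sum of squared market shares (percent); ranges up to 10,000."""
    return sum(s ** 2 for s in shares_pct)

before = [25, 25, 25, 25]  # four widget makers, 25% each
after = [75, 25]           # Widgets Inc. has bought two rivals

print(hhi(before))               # 2500
print(hhi(after))                # 6250
print(hhi(after) - hhi(before))  # 3750-point jump, far past the ~200-point
                                 # increase that normally draws close scrutiny
```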

But Amazon, despite its prior efforts in health care, is not a major competitor in the present landscape. Even an extremely determined Amazon foe with an axe to grind would have a hard time making a case that Amazon is dominant in or controls the U.S. health care market, because it doesn’t.

What Amazon does do is put tentacles out into a hundred market segments at once: It’s the biggest U.S. cloud services provider. It’s the biggest U.S. ecommerce site. It’s an Emmy- and Oscar-nominated streaming video service. It’s a surprisingly competitive music streaming service. It’s a high-end supermarket chain. It’s a shipping and logistics company. It’s even a pharmacy.

In short, if antitrust enforcement is usually concerned with the whales that move through the world hoovering up all the food (i.e. smaller companies) in their sight, Amazon is in comparison a giant squid, extending its reach farther away and grasping tightly. And while that might still be deadly if you’re metaphorically swimming nearby, there just isn’t as much of a playbook around for dealing with it.

Where regulators are starting to get concerned, though, is the way Amazon ties its various offerings together. Sometimes bundles can be hugely beneficial to consumers, but depending on how they’re shaped and enforced, they can also be harmful to would-be competitors. In the retail space, for example, the Fulfillment by Amazon program allows shoppers to buy from several disparate third-party merchants easily and at once, but merchants complain Amazon’s terms harm their business financially.

One Medical’s model is membership-based: For a fee of about $200 per year, participants get access to the entire One Medical system and all its features. Basically, it’s streamlined medical care as a service. That business model may sound awfully familiar to Amazon’s more than 160 million U.S. Prime subscribers, who pay $139 per year for access to the full array of Amazon goods and free shipping on those goods.

But when the Prime parent acquires a company that already has a fan base, things change. Amazon has “Amazonified” Whole Foods since acquiring the grocer, for example, and now ties some of its products and benefits to a Prime subscription. Whole Foods customers have not always been happy with the way those services have been integrated, and neither have employees. Some vendors have also had a harder time getting their products onto store shelves and out to consumers than before the acquisition, thanks to Amazon changing Whole Foods’ purchasing structure and leveraging its size to make vendors charge less for their products and cover more costs.

Perhaps more critically, Amazon also ties all the data from its various services and offerings together. And while grocery data is personal, a preference for organic avocados or fresh chocolate chip cookies is not anywhere near as sensitive as the kind of data a primary care practice has available about its patients. While some of that information is explicitly covered by HIPAA, other health care data that technology companies can glean from individuals is not. Several One Medical members immediately aired their concerns about data on Twitter as soon as the transaction was announced.

