How would you react if you had rented out your home for a few nights, only to return and discover it had been used for a wild, noisy house party? Shocked at the damage to your property, you would likely feel a mix of anger and sadness.
Such incidents have been widely documented worldwide in recent years, particularly during the COVID-19 pandemic. With bars and nightclubs closed, young adults sought alternative places to socialize, dance, and potentially indulge in excess drinking.
In response to this, the short-term rental giant Airbnb launched a “global party ban” and pledged to take all possible measures to prevent such disruptive behavior. This involved barring offenders from making new bookings and imposing restrictions on those under 25 who lacked a positive review history.
Airbnb recently reported that these efforts contributed to a 55% drop in reported parties between 2020 and last year. But as the battle continues, the US company has gone further, introducing an AI-powered software system to identify potential troublemakers.
Now operational globally, the AI system automatically scrutinizes every attempted booking on an Airbnb property. It checks factors such as when the account was created, and it raises concerns if you are trying to rent a property in the same city or town where you live. It also considers the length of the stay (a single-night booking is a potential red flag), and whether the planned visit coincides with festive periods such as Halloween or New Year's Eve.
Naba Banerjee, Airbnb's Head of Safety and Trust, says the system effectively prevents a significant number of house parties.
Naba Banerjee, the Head of Safety and Trust at Airbnb, explains that the AI system is designed to identify potential party bookings. For instance, if someone attempts to book a room for one night during New Year’s Eve and is from the same city as the host, it’s often indicative of a party intent.
Ms. Banerjee further states that when the AI determines a high risk of a party booking, it will either prevent the booking or redirect the person to the website of one of Airbnb’s partner hotel companies. This approach is intended to build and maintain trust, ensuring that hosts who rent out their homes through Airbnb can feel as secure as possible.
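As an illustration only, the kind of rule-based screening the article describes could be sketched as follows. Airbnb's actual model is proprietary; the signals, weights, threshold, and high-risk dates below are all assumptions made for the sake of the example:

```python
from dataclasses import dataclass
from datetime import date

# Assumed high-risk dates mentioned in the article: Halloween, New Year's Eve.
HIGH_RISK_DATES = {(10, 31), (12, 31)}

@dataclass
class BookingRequest:
    account_age_days: int
    guest_city: str
    listing_city: str
    nights: int
    check_in: date

def party_risk_score(req: BookingRequest) -> int:
    """Sum simple risk signals. All weights are illustrative, not Airbnb's."""
    score = 0
    if req.account_age_days < 30:  # newly created account
        score += 2
    if req.guest_city.lower() == req.listing_city.lower():  # booking in own city
        score += 3
    if req.nights == 1:  # single-night stay
        score += 2
    if (req.check_in.month, req.check_in.day) in HIGH_RISK_DATES:
        score += 3
    return score

def screen(req: BookingRequest, threshold: int = 6) -> str:
    # Above the threshold, the article says a booking is blocked or the
    # guest is redirected to a partner hotel site.
    if party_risk_score(req) >= threshold:
        return "redirect_to_partner_hotel"
    return "allow"
```

A local, one-night New Year's Eve booking from a brand-new account would score 2 + 3 + 2 + 3 = 10 and be redirected, matching the example Ms. Banerjee gives, while a week-long out-of-town booking from an established account would be allowed.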
Lucy Paterson, an Airbnb host in Worcestershire, rents out a one-bedroom annex beside her home and has had more than 150 bookings since she first listed it. She says that offering a one-bedroom place was her own strategy for minimizing the potential for parties. While it hasn't always been flawless, she believes about 99% of her guests have been excellent, and she appreciates the added reassurance that Airbnb's use of AI provides.
Looking ahead, Ms. Banerjee predicts that the AI system will continue to improve as it processes more data, ultimately enhancing its learning capabilities.
Lucy Paterson expresses her confidence in Airbnb’s utilization of AI to identify and deter potential partygoers.
In the car-sharing industry, the prominent online marketplace Turo uses an AI system, built on the DataRobot platform, to improve safety for people renting out their cars. The software swiftly assesses the risk of theft and sets prices for cars based on factors such as their size, power and speed, as well as when the rental starts and whether it is by the day or the week.
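Purely as an illustration of factor-based pricing of this kind, a toy model might scale a base rate by car and timing attributes. Turo's DataRobot-based model is proprietary; every factor and multiplier below is an assumption for the sake of the sketch:

```python
def estimate_daily_price(base_rate: float, engine_power_kw: float,
                         seats: int, starts_on_weekend: bool) -> float:
    """Toy daily price: scale a base rate by illustrative car/timing factors."""
    price = base_rate
    price *= 1 + engine_power_kw / 500   # more powerful cars cost more (assumed)
    price *= 1 + (seats - 4) * 0.05      # larger cars cost a bit more (assumed)
    if starts_on_weekend:
        price *= 1.15                    # assumed weekend-demand premium
    return round(price, 2)
```

For example, a 100 kW four-seater at a $50 base rate would price at $60.00 midweek and $69.00 for a weekend start. A real system would learn such multipliers from rental and theft data rather than hard-code them.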
Additionally, Turo employs AI to let some users interact with its app using voice commands, specifying the type of car they need and when they need it. The AI responds with text on screen, presenting a personalized list of recommended vehicles that meet the user's criteria. This feature is currently available to subscribers of the widely used chatbot ChatGPT, whose GPT-4 model is integrated into Turo's platform.
Turo’s Chief Data Officer, Albert Mangahas, emphasizes the company’s aim to simplify the Turo experience, fostering trust between the company and its customers.
Edward McFowland III, an assistant professor of technology and operations management at Harvard Business School, expresses the view that employing AI to screen potential problematic customers is a beneficial strategy. He believes that having an AI layer can reduce friction for both businesses and consumers.
However, he highlights an important caveat. Even a well-tuned AI model may generate false positives, potentially excluding individuals such as a young person booking an apartment for New Year's Eve with no intention of hosting a party. This underscores the ongoing challenge of achieving consistent accuracy in AI technology.
Lara Bozabalian does not accept bookings from first-time renters.
In Toronto, Canada, Lara Bozabalian utilizes Airbnb to rent her family’s 3,000 square foot (280 square meter) cottage. Regardless of how appealing a booking may appear or what the AI recommends, she adheres to her own policy.
She states, “I don’t accept bookings from first-time users. I prefer to have some level of verification from a previous host.”