Liapes v. Facebook, Inc. (2023)


Filed 9/21/23
    CERTIFIED FOR PUBLICATION
    IN THE COURT OF APPEAL OF THE STATE OF CALIFORNIA
    FIRST APPELLATE DISTRICT
    DIVISION THREE
    SAMANTHA LIAPES et al.,                      A164880
    Plaintiffs and Appellants,
    v.                                           (San Mateo County Super. Ct.
    FACEBOOK, INC.,                              No. 20CIV01712)
    Defendant and Respondent.
    Samantha Liapes filed a class action against Facebook, Inc. (Facebook,
    now known as Meta Platforms, Inc.), alleging it does not provide women and
    older people equal access to insurance ads on its online platform in violation
    of the Unruh Civil Rights Act and Civil Code section 51.5 — both of which
    prohibit businesses from discriminating against people with protected
characteristics, such as gender and age. (Civ. Code, §§ 51, 51.5, 52, subd. (a);
undesignated statutory references are to this code.) 1 Liapes alleged Facebook
    requires all advertisers to choose the age and gender of its users who will
    receive ads, and companies offering insurance products routinely tell it to not
    send their ads to women or older people. She further alleged Facebook’s ad-
    delivery algorithm, the system that determines which users will receive ads,
discriminates against women and older people by relying heavily on the two
key data points of age and gender. As a result, Liapes alleged, women and
older people were excluded from receiving insurance ads.

1 Some courts have used “the Unruh Act” to refer collectively to sections
51 and 52. (E.g., Munson v. Del Taco, Inc. (2009) 46 Cal.4th 661, 667–668.)
Section 51, however, indicates that statute “shall be known, and may be
cited, as the Unruh Civil Rights Act.” (Id., subd. (a).) We use Unruh Civil
Rights Act accordingly.
    The trial court sustained Facebook’s demurrer, deciding Liapes did not
    plead sufficient facts to support her discrimination claims. It concluded
    Facebook’s tools are neutral on their face and simply have a disproportionate
    impact on a protected class, rather than intentionally discriminating. The
    court further concluded Facebook was immune under section 230 of the
Communications Decency Act of 1996 (47 U.S.C. § 230 (section 230)), which
    applies to interactive computer service providers acting as a “publisher or
    speaker” of content provided by others. Liapes appealed. We review de novo
    the ruling on the demurrer. (Regents of University of California v. Superior
Court (2013) 220 Cal.App.4th 549, 558 (Regents).) Liberally construing the
    complaint and drawing all reasonable inferences in favor of Liapes’s claims,
    we conclude the complaint alleges facts sufficient to state a cause of action
    and reverse. (Ibid.)
BACKGROUND 2

2 All facts are taken from Liapes’s complaint.
    Facebook is a popular social networking service with over two billion
    users every month. As a condition to joining, users must provide it with their
    birth dates and gender. Users cannot opt out of disclosing this information.
    Users engage with Facebook in various ways, including through its “ ‘News
    Feed,’ ” “ ‘Stories,’ ” “ ‘Marketplace,’ ” and “ ‘Watch.’ ” Companies use it to
    send ads, such as for insurance products and services, to consumers. They
    pay Facebook to place their ads on users’ News Feeds.
    Facebook provides advertisers with several tools to determine who
    receives ads. One is “Audience Selection,” allowing advertisers “to specify the
    parameters of the target audience of Facebook users who will be eligible to
    receive the advertisement.” There are thousands of categories advertisers
    may select or exclude, such as interests and behaviors, when setting the
    audience. But advertisers are required to make three selections establishing
    basic target audience parameters: age, gender, and location. Each of these
    three categories has a drop-down menu indicating advertisers can include or
    exclude users by age or gender. The default setting is 18 to 65 years and
    older and all genders, meaning all users 18 years old or older would receive
    the ad.
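
The mechanics of these required parameters can be made concrete with a
short sketch. The following Python is purely illustrative: every class name,
field, and default in it is an assumption modeled on the allegations above,
not Facebook's actual code or API.

```python
# Illustrative sketch only: the three required Audience Selection parameters
# and their broad defaults, as the complaint describes them. All names and
# values here are hypothetical, not Facebook's actual code.
from dataclasses import dataclass, field

@dataclass
class AudienceSelection:
    min_age: int = 18                                  # default lower bound
    max_age: int = 65                                  # 65 stands in for "65 and older"
    genders: set = field(default_factory=lambda: {"all"})
    locations: set = field(default_factory=lambda: {"everywhere"})

def eligible(age: int, gender: str, sel: AudienceSelection) -> bool:
    """Users outside the selected parameters never receive the ad."""
    in_age = age >= sel.min_age and (age <= sel.max_age or sel.max_age >= 65)
    in_gender = "all" in sel.genders or gender in sel.genders
    return in_age and in_gender

# An advertiser narrowing the defaults to men ages 30 to 49:
narrowed = AudienceSelection(min_age=30, max_age=49, genders={"male"})
print(eligible(48, "female", narrowed))                # False: excluded by gender
```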
    Facebook, however, counsels against the broad default audience
    parameters. In “Facebook Blueprint,” a training program for advertisers,
    Facebook strongly encourages them to narrow the age range and genders of
    users who will receive ads to make them more effective. It suggests, for
    example, “ ‘Let’s start with gender. If you want, you can choose to reach out
    to only men or only women. If you have a bridal dress shop, women might be
    a better audience for you. But if you have a shaving and beard grooming
    business, maybe you’ll want to reach out to men.’ ” Other tips include
    considering one’s customer base: “ ‘[t]hink about what [your customers] like,
    how old they are and the interests they have. This can help you identify
    audience options that will help you reach people like them on Facebook.’ ”
    Thus, if “ ‘the majority of your current customers are women, it might be a
    good idea to set your audience to reach women and exclude men.’ ”
    Once the audience is selected, the advertiser determines the ad content
and the Facebook page or other web page to which the ad will link. The
    advertiser purchases impressions — an event that occurs every time a user is
    shown an ad on Facebook — or clicks — an event that occurs every time a
    user clicks on an ad. Facebook then sends the ad to users within the
    Audience Selection parameters. Users who are not within the selected
    audience will not receive the ad.
    Facebook also allows advertisers to target their ads through a
    “Lookalike Audiences” tool. Advertisers provide Facebook with a list of users
    “whom they believe are the type of customers they want to reach.” Facebook
    then applies its own analysis and algorithm to identify a larger audience
    resembling the sample audience. The resulting audience will be eligible to
    receive the ad. Facebook expressly uses age and gender to directly determine
    which users will be included in a Lookalike Audience. Thus, if an advertiser
    creates a sample audience that is disproportionately male or younger, the
    Lookalike Audience will disproportionately exclude women and older people.
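
A minimal sketch of this expansion step follows, assuming (as the complaint
alleges) that age and gender are direct similarity features; the scoring
function and its weights are invented for illustration only.

```python
# Illustrative sketch only: expand an advertiser's sample list into a larger
# "lookalike" audience, with age and gender used directly as similarity
# features. The weighting is an invented assumption, not Facebook's model.
from statistics import mean

def lookalike_audience(sample, all_users, size):
    avg_age = mean(u["age"] for u in sample)
    female_share = mean(1.0 if u["gender"] == "female" else 0.0 for u in sample)

    def similarity(user):
        gender_value = 1.0 if user["gender"] == "female" else 0.0
        # Closer to the sample's age and gender mix scores higher.
        return -(abs(user["age"] - avg_age) + 50 * abs(gender_value - female_share))

    return sorted(all_users, key=similarity, reverse=True)[:size]
```

On this model, a sample audience that skews young and male yields a
lookalike audience that skews the same way, excluding women and older
users even though the advertiser set no explicit exclusion.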
    Once the audience has been selected, Facebook thereafter uses an ad-
    delivery algorithm to further determine which users within a particular
    audience will receive ads. “For example, if an advertiser chooses an audience
    selection of 500,000 but purchases only 50,000 impressions to be sent to
    Facebook users within that audience selection, Facebook must determine
    which of the 500,000 Facebook users will actually receive the advertisement.”
    The algorithm uses a variety of data points, such as data about each user and
    past and ongoing performance of certain types of ads to determine which
    users will receive the ad. In doing so, the algorithm relies heavily on age and
    gender to determine which users will actually receive the ad, regardless of
    whether the advertiser directs Facebook to limit its Audience Selection based
    on those factors.
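
Reduced to its alleged essentials, this delivery step is a ranking problem:
score every user in the selected audience and send the ad to the top slice
matching the impressions purchased. A toy sketch follows, with weights
invented solely to mirror the allegation that age and gender dominate the
other data points; nothing here is Facebook's actual model.

```python
# Illustrative sketch only: pick which members of a 500,000-user audience
# receive the 50,000 purchased impressions. The weights are assumptions
# chosen to mirror the complaint's allegation, not Facebook's actual code.
def predicted_engagement(user):
    score = 0.0
    score += 3.0 if user["gender"] == "male" else 0.0   # heavy gender weight
    score += 2.0 if user["age"] < 45 else 0.0           # heavy age weight
    score += 0.1 * user["past_ad_clicks"]               # other data points
    return score

def deliver(audience, impressions_purchased):
    ranked = sorted(audience, key=predicted_engagement, reverse=True)
    return ranked[:impressions_purchased]               # e.g., top 50,000 of 500,000
```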
    One research study of Facebook’s ad platform “ ‘observe[d] significant
    skew in delivery along gender . . . despite neutral targeting parameters.’ ”
    This bias, the researchers concluded, was the result of the platform — not the
    advertisers — making choices about which users to show the ads. Another
    study auditing over 100,000 ads published on Facebook determined credit ads
    were more likely to be sent to a larger share of men than women.
    Liapes is a 48-year-old woman and regular Facebook user. She was
    interested in learning about insurance products via ads on her News Feed
    because she did not have life insurance at that time. But she could not view
    several life insurance ads posted on Facebook due to her age or gender; had
    she been able to view the ads, she would have qualified for the insurance,
applied for a quote, and possibly obtained a policy. For example, a life
    insurance ad by Ladder was only sent to people age 25 to 45. She did not see
    a Health IQ Special Rate Insurance ad since it was only sent to males ages 30
    to 64. Similarly, she did not see a National Family Assurance ad because it
    was only sent to males ages 30 to 49. In addition, she did not see four ads for
    auto insurance or four ads for services comparing auto insurance rates in her
    News Feed. As a result, she had a harder time learning about those products
    or services.
    In 2020, Liapes filed a class action alleging Facebook violated the
    Unruh Civil Rights Act by engaging in age and gender discrimination when
    providing users with ads regarding insurance opportunities. 3 She alleged she
    and class members were harmed by being segregated, classified, and treated
    in an unequal, stereotypical, and arbitrary manner, and being denied
    information they have a right to receive on an equal basis because of their
age and/or gender. 4 In addition, Liapes alleged Facebook aided, abetted,
    and/or incited numerous insurance companies to publish the ads in a way
    that denied older persons and/or women full and equal accommodations,
    advantages, facilities, and services of their business establishments. (§ 51,
    subd. (b).) Based on the same allegations, Liapes further asserted Facebook
    violated section 51.5 by intentionally discriminating against, boycotting,
    and/or refusing to provide services to women and older people based on their
age and gender.

3 This is Liapes’s first amended complaint. Her original complaint
alleged Facebook’s Audience Selection tools and delivery algorithm routinely
and systematically excluded older persons and women from viewing
thousands of ads regarding financial services opportunities. Facebook
demurred and moved to stay the case in favor of a separate federal case filed
by Liapes’s counsel asserting the same claims. After the federal case was
dismissed, Liapes filed the amended complaint.

4 In 2018, Facebook entered into a settlement with the Washington
State Attorney General, prohibiting Facebook from excluding users from
receiving insurance ads based on race, creed, color, national origin, veteran or
military status, sexual orientation, or disability.
    The trial court sustained Facebook’s demurrer. It determined Liapes
    failed to allege Facebook engaged in intentional discrimination because the
    default setting for the Audience Selection tool and Lookalike Audience is age
    and gender neutral. The court disregarded Liapes’s allegations that the ad-
    delivery algorithm expressly discriminated on the basis of age and gender to
    increase the likelihood users would click on each ad and thus increase
    Facebook’s revenue. The court explained these allegations were inconsistent
    with those in the original complaint — that the purpose of the algorithm was
    “to optimize an advertisement’s audience and the advertiser’s goals by
    showing the advertisement preferentially to the users Facebook believes will
    maximize” views, engagement with the ad, and sales. The court also rejected
    Liapes’s aiding and abetting claim, concluding there were insufficient facts
    indicating Facebook knew the advertisers engaged in discrimination or
    substantially assisted them. Finally, the court determined Liapes’s claims
were barred by section 230 because the Audience Selection and Lookalike
Audience tools were neutral. Liapes appealed the order rather than
amending her complaint.
    DISCUSSION
    Liapes contends the trial court erroneously sustained Facebook’s
    demurrer. When reviewing a ruling on a demurrer, we examine de novo
    whether the complaint alleges facts sufficient to state a cause of action.
    (Regents, supra, 220 Cal.App.4th at p. 558.) “We assume the truth of the
    properly pleaded factual allegations, [and] facts that reasonably can be
    inferred from those expressly pleaded.” (Ibid.) But we do not assume the
truth of “contentions, deductions, or conclusions of law.” (Stearn v. County of
San Bernardino (2009) 170 Cal.App.4th 434, 440.) We liberally construe the
    complaint “with a view to substantial justice between the parties,” drawing
    “all reasonable inferences in favor of the asserted claims.” (Regents, at
p. 558; Candelore v. Tinder, Inc. (2018) 19 Cal.App.5th 1138, 1143
    (Candelore).) The plaintiff must demonstrate the court erroneously sustained
    the demurrer and “must show the complaint alleges facts sufficient to
    establish every element of each cause of action.” (Rakestraw v. California
Physicians’ Service (2000) 81 Cal.App.4th 39, 43.) Having engaged in that
    review, we agree the demurrer should not have been sustained.
    I.
    The Unruh Civil Rights Act’s purpose is “to secure to all persons equal
    access to public accommodations ‘no matter’ ” the personal characteristics.
(Harris v. Capital Growth Investors XIV (1991) 52 Cal.3d 1142, 1169.) It is
    intended to eradicate arbitrary, invidious discrimination in business
    establishments, and stand “as a bulwark protecting each person’s inherent
    right to ‘full and equal’ access to ‘all business establishments.’ ” (Angelucci v.
Century Supper Club (2007) 41 Cal.4th 160, 167 (Angelucci).) Under the
    statute, all persons “are entitled to the full and equal accommodations,
    advantages, facilities, privileges, or services in all business establishments of
    every kind whatsoever.” (§ 51, subd. (b).) The statute lists 14 types of
    prohibited discrimination, such as sex, race, and religion. (Ibid.) But the list
    is “illustrative, rather than restrictive” — the statute forbids discrimination
beyond these enumerated categories. (In re Cox (1970) 3 Cal.3d 205, 212.)
    Thus, while not expressly identified, the Unruh Civil Rights Act prohibits
    arbitrary discrimination based on a person’s age — “a personal characteristic
    similar to the classifications enumerated in the Act.” (Candelore, supra,
    19 Cal.App.5th at p. 1145.) Courts liberally construe the Unruh Civil Rights
    Act to achieve its remedial purpose of deterring discriminatory business
practices. (White v. Square, Inc. (2019) 7 Cal.5th 1019, 1025 (White).)
    Section 51.5 similarly provides “[n]o business establishment . . . shall
    discriminate against, boycott or blacklist, or refuse to buy from, contract
    with, sell to, or trade with any person in this state on account of any
    characteristic listed or defined in” the Unruh Civil Rights Act. (§ 51.5,
    subd. (a).)
    A.
    Facebook argues Liapes lacks standing to litigate her Unruh Civil
    Rights Act claim 5 because she was not injured by Facebook’s ad-targeting
    methods that excluded women and older people from viewing insurance ads.
    Since challenges to standing are jurisdictional and may be raised at any time
in the proceeding, including for the first time on appeal as here, Facebook has
not forfeited this argument. (Qualified Patients Assn. v. City of Anaheim
(2010) 187 Cal.App.4th 734, 751.) Its argument nonetheless fails.

5 Because the analysis of the Unruh Civil Rights Act claim is the
same as the section 51.5 analysis, we refer only to the Unruh Civil Rights
Act for ease of reference. (Semler v. General Electric Capital Corp. (2011)
196 Cal.App.4th 1380, 1404.) But our conclusions apply equally to the Unruh
Civil Rights Act and section 51.5 claims.
    “Standing under the Unruh Civil Rights Act is broad.” 6 (Osborne v.
Yasmeh (2016) 1 Cal.App.5th 1118, 1127.) When “any person or group of
    persons is engaged in conduct of resistance to the full enjoyment of any of the
    rights” under the Unruh Civil Rights Act, “any person aggrieved by the
    conduct may bring a civil action.” (§ 52, subd. (c).) Plaintiffs, however, may
    not sue for discrimination in the abstract; they “ ‘must actually suffer the
discriminatory conduct.’ ” (Angelucci, supra, 41 Cal.4th at p. 175.) Thus,
    only plaintiffs who have transacted with a defendant and have been subject
to discrimination have standing under the Unruh Civil Rights Act. (White,
supra, 7 Cal.5th at p. 1026.)
    Liapes satisfied these requirements. (Regents, supra, 220 Cal.App.4th
at p. 558.) As a Facebook user, she has transacted with it. (White, supra,
7 Cal.5th at p. 1026.)
    provide such information as a condition of joining Facebook. Liapes was
    interested in insurance ads available on Facebook. In particular, she was
    interested in obtaining life insurance because she did not have a policy at the
    time. Moreover, she was qualified to obtain the insurance. But Facebook,
Liapes alleged, used its Audience Selection tool, Lookalike Audience feature,
and ad-delivery algorithm to exclude her from receiving certain insurance ads
because of her gender and/or age.

6 Midpeninsula Citizens for Fair Housing v. Westwood Investors (1990)
221 Cal.App.3d 1377 does not hold otherwise. There, the Court of Appeal
determined a fair housing organization was not an aggrieved person under
the Unruh Civil Rights Act merely because the defendants’ allegedly
discriminatory rental policy drained the organization’s limited financial
resources from its educational and counseling services and diverted them
toward investigating discrimination claims made against the defendants —
which might have been a basis for standing in federal court. (Midpeninsula,
at pp. 1382, 1385.) Organizational standing based on diversion of resources
is not at issue here.
    The alleged injury is not conjectural or hypothetical. (Osborne v.
    Yasmeh, supra, 1 Cal.App.5th at p. 1127.) Liapes identified a life insurance
    ad that was only sent to males ages 30 to 49 because the advertiser used the
    Audience Selection tool. In another instance, a life insurance ad was not
    shown to her because it was only sent to people ages 25 to 45 — based on the
    advertiser’s use of the Audience Selection tool — and because the advertiser
    wanted to reach people similar to its customers — based on the advertiser’s
    use of the Lookalike Audience tool. Liapes further alleged, upon information
    and belief, that Facebook created thousands of Lookalike Audiences for
    insurance ads using age and gender to place users in the Lookalike
    Audiences. Because Liapes did not share characteristics with those
    Lookalike Audiences, she was less likely to receive the insurance ads or
    denied ads based on her gender and/or age. Moreover, she alleged the ad-
    delivery algorithm heavily weighted age and gender in advertising, thus
    skewing ads towards men rather than women. According to Liapes, it is
    important to immediately apply for and secure insurance offers because they
    often change or may expire. By excluding women and older people from ads,
    men and younger people obtained an advantage in the limited opportunities
    for securing insurance policies. Accepting her factual allegations as true,
    Liapes actually suffered discrimination — she was deprived of information
    regarding insurance opportunities despite being ready and able to pursue
    those opportunities. (Angelucci, supra, 41 Cal.4th at pp. 165, 175.)
    Relying on general notions about effective advertising not appearing in
    the complaint, Facebook argues Liapes is not aggrieved because advertisers
    may have and often do run different versions of ads, such as different copy or
    graphics, targeted to women and older people. Facebook further faults
    Liapes for failing to identify insurance ads she actually received, noting they
    may have been more valuable to her than those to which she was denied
    access. Such inferences are not appropriate at this stage of the litigation — a
    demurrer is not “the appropriate procedure for determining the truth of
    disputed facts or what inferences should be drawn where competing
inferences are possible.” (CrossTalk Productions, Inc. v. Jacobson (1998)
65 Cal.App.4th 631, 635.) Moreover, according to the complaint, upon
information and belief, the age- and gender-restricted insurance ads were not
part of a parallel ad campaign whereby Facebook delivered the same or
similar ads to women and older people. 7 Liapes further identified several ads
that did not appear to her on Facebook — she had to be “informed that she
was denied such ads because of her age and/or gender.” Because she did not
receive these ads, she independently sought information about the insurance
companies and services through the advertisers’ websites, not Facebook. Her
allegations sufficiently alleged an injury for standing purposes. (Angelucci,
supra, 41 Cal.4th at p. 167.)

7 We do not disregard these allegations, as Facebook urges, simply
because they are based “upon information and belief.” Allegations concerning
matters “ ‘peculiarly within the knowledge of the adverse party,’ ” as is the
case here, may be pleaded in this manner. (Dey v. Continental Central Credit
(2008) 170 Cal.App.4th 721, 725, fn. 1.)
    B.
    Facebook next contends Liapes failed to state a claim under the Unruh
    Civil Rights Act. Facebook argues it does not engage in intentional
    discrimination; rather, it has neutral practices that, at most, have a
    disparate negative impact on the protected classes of gender and age.
(Koebke v. Bernardo Heights Country Club (2005) 36 Cal.4th 824, 854
(Koebke).) Because the Unruh Civil Rights Act only reaches business
    practices that constitute intentional, invidious discrimination — not neutral
    practices that disparately impact protected groups — Facebook argues
    Liapes’s claim is fatally flawed. (Ibid.) We disagree.
    To state a claim under the Unruh Civil Rights Act, a plaintiff must
    allege the defendant is a business establishment that intentionally
    discriminates against and/or denies plaintiff full and equal treatment of a
    service, advantage, or accommodation based on plaintiff’s protected status.
    (§§ 51, subd. (b), 51.5; Candelore, supra, 19 Cal.App.5th at pp. 1144–1146;
Martinez v. Cot’n Wash, Inc. (2022) 81 Cal.App.5th 1026, 1036 [“Unless an
    Unruh Civil Rights Act claim is based on an [Americans with Disabilities Act
    of 1990] violation,” a plaintiff must prove intentional discrimination].)
    Intentional discrimination requires “ ‘willful, affirmative misconduct.’ ”
(Koebke, supra, 36 Cal.4th at p. 853.) And plaintiffs must allege more than
    the disparate impact of a facially neutral policy on a particular protected
    group. (Id. at p. 854.)
    Construing the complaint liberally and drawing all reasonable
    inferences in favor of the asserted claims, Liapes has stated an Unruh Civil
    Rights Act claim. (Regents, supra, 220 Cal.App.4th at p. 558.) Facebook
qualifies as a business establishment. (White, supra, 7 Cal.5th at p. 1032
    [Unruh Civil Rights Act prohibits discrimination by online businesses].) And
    it does not dispute women and older people were categorically excluded from
    receiving various insurance ads — an admitted service of Facebook — on its
    platform. (Candelore, supra, 19 Cal.App.5th at p. 1152 [people are entitled to
    full and equal accommodations and services in all business establishments of
    every kind, including less essential commercial services].)
    12
    Liapes further alleged Facebook engaged in intentional discrimination
    by designing and employing ad tools that expressly make distinctions based
    on gender and age when creating the target audience for insurance ads.
(Koire v. Metro Car Wash (1985) 40 Cal.3d 24, 35–36 [discount program for
women violated Unruh Civil Rights Act because it singled out customers
based on protected class status, without any compelling societal interest].)
    Facebook, not the advertisers, classifies users based on their age and gender.
    Advertisers using the Audience Selection tool are required to identify the age
    and gender preferences for their target audience. While the default audience
    setting is 18 to 65 years of age and older and all genders, Facebook provides
    advertisers with the option of easily including or excluding entire groups
    from the target audience by checking categories on a drop-down menu.
    Moreover, Facebook encourages advertisers to target users based on age and
    gender. It urges advertisers to “ ‘[t]hink about what [your customers] like,
    how old they are and the interests they have. This can help you identify
    audience options that will help you reach people like them on Facebook.’ ”
    Facebook explains, if “ ‘the majority of your current customers are women, it
    might be a good idea to set your audience to reach women and exclude men.’ ”
    And insurance advertisers allegedly excluded protected categories of
    persons — Liapes identified several insurance ads she did not receive
    because she was expressly outside the Audience Selection parameters for
    age or gender, thus requiring her to independently search for insurance
opportunities. (See, e.g., Smith v. BP Lubricants USA Inc. (2021)
64 Cal.App.5th 138, 151 [allegation that employee made three racist
    comments to plaintiff was sufficient to allege intentional discrimination
    under Unruh Civil Rights Act].)
    13
    To the extent Facebook argues it was not responsible for any unequal
    treatment Liapes experienced because it merely followed the advertisers’
    selections, we disagree. The complaint alleged Facebook presents advertisers
    the opportunity to discriminate based on gender and age. (Cf. Fair Housing
Coun., San Fernando v. Roommates.com, LLC (9th Cir. 2008) 521 F.3d 1157,
1164, 1167 (Roommates) [website could violate nondiscrimination laws by
    providing users the option to choose between nondiscriminatory and
    discriminatory preferences when searching for housing].) Facebook, rather
    than the advertisers, sends the ads to users within the Audience Selection
    parameters. Facebook retains the discretion and ability to approve and send
    an ad that includes age- or gender-based restrictions. Thus, Liapes alleged,
    whenever Facebook delivers an age- or gender-restricted ad, Facebook
    knowingly sends or publishes an ad that discriminates.
    Allegations regarding the Lookalike Audience tool further indicate
    Facebook intentionally uses gender and age when targeting ads. For
    example, it is Facebook that creates the Lookalike Audience resembling the
    advertiser’s sample audience. When analyzing the characteristics of the
    sample audience to determine the larger Lookalike Audience, Facebook
    directly relies on the users’ age and gender. This occurs regardless of
    whether the advertiser has created a sample audience with age or gender
    exclusions. Thus, while an advertiser provides Facebook with the sample
    audience, “the rest of the work to create the Lookalike Audience is done
    entirely by Facebook,” and it is that work that ultimately results in ad denial.
    After the audience is selected, the ad-delivery algorithm — determining
    which users within a particular audience will receive ads — is no different.
    According to the complaint, both age and gender are weighted more heavily
    than other characteristics or data points. More importantly, Facebook uses
    age and gender to determine who will receive the ads, regardless of whether
    the advertiser directs Facebook to limit the age or gender of recipients. Thus,
    even if advertisers do not limit their audience to a specific gender or age,
    Facebook makes those distinctions on behalf of advertisers via the ad-delivery
    algorithm. As a result, Liapes was unable to view several insurance ads, even
    when advertisers did not expressly exclude women and older people.
    We agree with Liapes that the trial court erred when it disregarded her
    allegations about the algorithm. We discern no inconsistency between her
    allegations in the original complaint regarding the purpose of the ad-delivery
    algorithm — to optimize both the ad’s audience and the advertiser’s goals —
    and those in her first amended complaint — to increase the likelihood
    Facebook users will click on each ad because revenue increases when users
    click more often on ads. These allegations reinforce each other. Over 98
    percent of Facebook’s revenue comes from advertisers who pay to publish ads.
    According to the complaint, Facebook wants ads to be as “ ‘relevant’ ” as
    possible to ensure users spend more time on Facebook and allow it to sell and
    place more ads. Because Facebook increases its revenue when users engage
    with ads, it has the incentive to optimize the audience for those ads. These
    are not conflicting factual allegations and did not warrant the court’s
disregard. (Panterra GP, Inc. v. Superior Court (2022) 74 Cal.App.5th 697,
    730 [if an amended complaint contains facts contradicting an earlier
    complaint in the same lawsuit, a court can take judicial notice of the
    inconsistent statements and disregard the conflicting factual allegations].)
    More importantly, that the ad-delivery algorithm may serve a
    legitimate business interest, such as optimizing an ad’s audience or
    connecting users to ads, is not fatal to Liapes’s Unruh Civil Rights Act claim.
    “[L]egitimate business interests may justify limitations on consumer access to
public accommodations.” (Harris v. Capital Growth Investors XIV, supra,
52 Cal.3d at p. 1162.) But while businesses can make economic distinctions
in serving customers, those distinctions must be based on characteristics that
“could conceivably be met by any customer” — not personal characteristics.
(Id. at p. 1163.) For example, discounts based on gender violate the Unruh
Civil Rights Act, but discounts “to any customer who meets a condition which
any patron could satisfy (e.g., presenting a coupon, or sporting a certain color
shirt or a particular bumper sticker)” are permissible. (Koire v. Metro Car
Wash, supra, 40 Cal.3d at p. 36.) The “quest for profit maximization can
never serve as an excuse for prohibited discrimination among potential
customers.” (Candelore, supra, 19 Cal.App.5th at p. 1153.) Thus, a
defendant who pursues discriminatory practices motivated by “ ‘rational self-
interest,’ ” such as economic gain, nonetheless violates the Unruh Civil
Rights Act. 8 (Marina Point, Ltd. v. Wolfson (1982) 30 Cal.3d 721, 740–741,
fn. 9.) On demurrer, the critical issue here is whether Liapes sufficiently
alleged Facebook’s ad platform discriminates against a protected class, such
as women and older people, even if in pursuit of those legitimate business
goals. We conclude she has. 9

8 Distinctions, such as those based on age, are unlawful if they
constitute “ ‘arbitrary, invidious or unreasonable discrimination.’ ” (Javorsky
v. Western Athletic Clubs, Inc. (2015) 242 Cal.App.4th 1386, 1398.)
Differential treatment is reasonable and nonarbitrary if there is a strong
public policy in favor of the distinctions. (Ibid.; Sargoy v. Resolution Trust
Corp. (1992) 8 Cal.App.4th 1039, 1044 [bank offering older people savings
accounts with higher interest rates was not arbitrary discrimination because
it served policy considerations, such as elderly people having limited incomes
and inability to work due to health problems, as articulated in a myriad of
statutes].) Facebook does not argue its allegedly discriminatory ad platform
is justified by any public policy.

9 In disputing this conclusion, Facebook refers repeatedly to
information outside the pleadings. For example, Facebook asserts its policies
expressly forbid advertisers from discriminating based on protected
attributes. And it further suggests that Liapes might have received parallel
ads that “may well” have been “more ‘valuable’ ” to her. Finally, Facebook
asserts Liapes’s references to its training materials have been taken out of
context. Whatever the merits of these arguments may be, Facebook ignores
that, on demurrer, we test the pleadings alone. (SKF Farms v. Superior
Court (1984) 153 Cal.App.3d 902, 905.) The only issue “is whether the
complaint, as it stands, unconnected with extraneous matters, states a cause
of action.” (Ibid.) Facebook should rest assured it will be able to develop the
record and its arguments further — just not at this stage of the litigation.
    The foregoing makes clear that Liapes alleged intentional
    discrimination, not disparate impact as Facebook asserts. Disparate impact
    analysis “relies on the effects of a facially neutral policy on a particular
group.” (Koebke, supra, 36 Cal.4th at p. 854.) Specifically, it requires
    inferring discriminatory intent solely from those effects. (Ibid.) Here, by
    contrast, Liapes alleged Facebook crafted tools such as the Lookalike
    Audience and ad-delivery algorithm that expressly rely on users’ age and
    gender; i.e., they are not facially neutral. Those characteristics are then used
    to exclude women and older people from receiving insurance ads. Finally,
    while a disparate impact analysis does not apply to Unruh Civil Rights Act
    claims, nothing precludes “the admission of relevant evidence of disparate
    impact in Unruh Act cases” because it “may be probative of intentional
discrimination.” (Harris v. Capital Growth Investors XIV, supra, 52 Cal.3d at
p. 1175.) Such evidence exists here — Liapes alleged Facebook’s ad platform
    has a significant skew in delivery along gender lines. Combined with
    allegations that Facebook expressly relies on gender and age to determine the
    audience for its ads, the complaint raises a plausible inference Facebook
    treated Liapes unequally because of her gender and age — a valid Unruh
    Civil Rights Act claim of intentional discrimination by a business
    establishment. (Doheny Park Terrace Homeowners Assn., Inc. v. Truck Ins.
Exchange (2005) 132 Cal.App.4th 1076, 1099.)
    C.
    Facebook also contends Liapes failed to state an Unruh Civil Rights Act
    claim under an aiding and abetting theory of liability because she does not
    adequately allege it acted with an intent to facilitate discriminatory conduct.
    We disagree.
    A person who aids and abets the commission of an offense, such as an
    intentional tort, may be liable if the person “ ‘knows the other’s conduct
    constitutes a breach of duty and gives substantial assistance or
    encouragement to the other to so act’ ” or “ ‘gives substantial assistance to the
    other in accomplishing a tortious result and the person’s own conduct,
    separately considered, constitutes a breach of duty to the third person.’ ”
(Fiol v. Doellstedt (1996) 50 Cal.App.4th 1318, 1325–1326.) A person can be
    liable for aiding and abetting violations of civil rights laws. (Cf. Alch v.
Superior Court (2004) 122 Cal.App.4th 339, 389 [aiding and abetting theory
    of liability applies to Fair Employment and Housing Act claims].)
    The complaint satisfied these elements. It adequately alleged Facebook
knew insurance advertisers intentionally targeted their ads based on users’
    ages and gender — as explained above, a violation of the Unruh Civil Rights
Act. (Casey v. U.S. Bank Nat. Assn. (2005) 127 Cal.App.4th 1138, 1149
    [requiring plaintiff to first identify the violation for which plaintiff seeks to
    hold defendant liable].) According to Liapes, the coding in Facebook’s
    platform identifies each type of business, including insurance advertisers,
    that purchases ads. In addition, Facebook is aware of the ad’s subject matter,
    including insurance ads. Facebook was aware ads contained age- or gender-
    based restrictions because it alone approved and sent the ads to the target
audience. (Schulz v. Neovi Data Corp. (2007) 152 Cal.App.4th 86, 94
    [allegation defendant knew they were facilitating orders for unlawful
    pyramid scheme satisfied knowledge requirement for aiding and abetting
    claim].) Thus, Liapes alleged, Facebook knew older people and women were
    being discriminated against with regard to the provision of insurance ads.
    (Casey v. U.S. Bank Nat. Assn., supra, 127 Cal.App.4th at p. 1145 [liability
    for aiding and abetting depends on proof the defendant had actual knowledge
    of the specific primary wrong the defendant substantially assisted].)
    The complaint also sufficiently alleged the element of substantial
    assistance or encouragement. (Fiol v. Doellstedt, supra, 50 Cal.App.4th at
    p. 1326.) Each time an advertiser used the Audience Selection tool and made
    a discriminatory targeting decision based on age or gender, Facebook
    followed the selected audience parameters. Indeed, this occurred despite
    Facebook retaining the discretion to reject ads that include age- or gender-
    based restrictions. Facebook further maintained the age and gender
    Audience Selection criteria despite its awareness advertisers were making
    discriminatory advertising choices. (Schulz v. Neovi Data Corp., supra,
    152 Cal.App.4th at p. 94 [defendant substantially assisted and encouraged
    illegal conduct by allowing configuring of website to authorize processing of
    credit card payments].) Although the default setting for the Audience
    Selection tool is for all genders and people over the age of 18, Facebook
    encourages advertisers to narrow the gender of the users who will receive ads
    to make them more effective. In one instance, Facebook stated if “ ‘the
    majority of your current customers are women, it might be a good idea to set
    your audience to reach women and exclude men.’ ”
    Facebook nonetheless argues Liapes must also plead it had the specific
    intent to facilitate the advertisers’ Unruh Civil Rights Act violations. (See,
e.g., Gerard v. Ross (1988) 204 Cal.App.3d 968, 983.) We need not decide
    whether this is a required element for aiding and abetting liability — read
    liberally, the complaint alleges Facebook intended to assist the insurance
    advertisers in excluding women and older people from receiving their ads.
(Nasrawi v. Buck Consultants LLC (2014) 231 Cal.App.4th 328, 345.) Liapes
    alleged Facebook encourages, facilitates, expects, and wants advertisers to
    routinely exclude older persons and women from their Audience Selections so
    they will not receive ads on insurance opportunities. “Fairly read, that
    allegation indicates intent to participate” in the illegal activity. (Ibid.)
    In sum, the trial court erred in sustaining Facebook’s demurrer to
    Liapes’s complaint.
    II.
    Liapes contends section 230 does not immunize Facebook from liability
    because it acted as a content provider. We agree.
    Section 230 “immunizes providers of interactive computer services
against liability arising from content created by third parties.” (Roommates,
supra, 521 F.3d at p. 1162, fn. omitted.) It states, in relevant part, “[n]o
    provider or user of an interactive computer service” — meaning “any
    information service, system, or access software provider that provides or
    enables computer access by multiple users to a computer server” — shall “be
    treated as the publisher or speaker of any information provided by another
information content provider.” (47 U.S.C. § 230, subds. (c)(1), (f)(2).) These
    provisions convey “an intent to shield Internet intermediaries from the
    burdens associated with defending against state law claims that treat them
as the publisher or speaker of third party content.” (Hassell v. Bird (2018)
5 Cal.5th 522, 544.) “ ‘The prototypical service qualifying for [CDA]
    immunity is an online messaging board (or bulletin board) on which Internet
    subscribers post comments and respond to comments posted by others.’ ”
(Dyroff v. Ultimate Software Group, Inc. (9th Cir. 2019) 934 F.3d 1093, 1097,
brackets in original.)
    But an interactive computer service provider only has immunity if it is
    not also the information content provider — that is, someone “responsible, in
    whole or in part, for the creation or development” of the content at issue.
(47 U.S.C. § 230, subd. (f)(3); Roommates, supra, 521 F.3d at p. 1162.)
    Passively displaying content “created entirely by third parties” renders the
    operator only a service provider “with respect to that content.” (Roommates,
    at p. 1162.) “But as to content that it creates itself, or is ‘responsible, in
    whole or in part’ for creating or developing, the website is also a content
    provider.” (Ibid.) “Thus, a website may be immune from liability for some of
    the content it displays to the public but be subject to liability for other
    content.” (Id. at pp. 1162–1163.)
    Roommates — concluding a website matching people renting spare
    rooms with others seeking housing was not entitled to section 230
immunity — is instructive. (Roommates, supra, 521 F.3d at p. 1165.)
    The website required users to state the gender, sexual orientation, and
    familial status of their desired tenants. (Id. at p. 1161.) The website
    operator then used those preferences to determine which postings were
    shown to other users based on their selections from drop-down menus and
    pre-populated lists. (Id. at pp. 1161–1162, 1165.) By eliciting information
    about protected characteristics and thereafter using that information to
    determine postings other users could view, the website operator was partially
    responsible for the development of allegedly illegal content. (Id. at pp. 1165,
    1167.) The court concluded section 230 “does not grant immunity for
    inducing third parties to express illegal preferences.” (Roommates, at
    p. 1165.)
    There is little difference with Facebook’s ad tools. Like the website at
    issue in Roommates, Facebook requires users to disclose their age and gender
before they can use its services. (Roommates, supra, 521 F.3d at p. 1161.) It
    designed and created an advertising system, including the Audience Selection
    tool, that allowed insurance companies to target their ads based on certain
    characteristics, such as gender and age. (Vargas v. Facebook, Inc. (9th Cir.,
    June 23, 2023, No. 21-16499) 2023 U.S.App. Lexis 15796 (Vargas);
Roommates, at p. 1161; Allen v. City of Sacramento (2015) 234 Cal.App.4th
41, 64, fn. 4 [authorizing citation and reliance on unpublished federal court
    decisions as persuasive authority].) Although there are thousands of
    characteristics advertisers may choose to identify their target audiences,
    Facebook requires advertisers to select age and gender parameters. Each
    category includes “simple drop-down menus and toggle buttons to allow”
    insurance advertisers “to exclude protected categories of persons.” (Vargas,
2023 U.S.App. Lexis 15796, p. *7; Roommates, at p. 1161.) Insurance
    advertisers then “allegedly used the tools to exclude protected categories of
    persons from seeing some advertisements.” (Vargas, 2023 U.S.App. Lexis
    15796, p. *7.) Facebook “identified persons in protected categories and
    offered tools that directly and easily allowed advertisers to exclude all
    persons of a protected category (or several protected categories).” (Vargas,
    2023 U.S.App. Lexis 15796, p. *9.) In doing so, Facebook does not merely
    proliferate and disseminate content as a publisher. (Kimzey v. Yelp! Inc. (9th
Cir. 2016) 836 F.3d 1263, 1271.) It creates, shapes, or develops content “by
    materially contributing” to the content’s alleged unlawfulness. (Roommates,
    at pp. 1167–1168.)
    These circumstances are distinguishable from those in Prager
University v. Google LLC (2022) 85 Cal.App.5th 1022. In that case, the
    defendant video sharing website restricted access to videos based on certain
    criteria regarding the content, such as talking about drug use or abuse,
    overly detailed conversations or depictions of sexual activity, and
    inappropriate language. (Id. at p. 1029.) The plaintiff alleged defendant
    violated the Unruh Civil Rights Act, among other statutes, by restricting
    access to the plaintiff’s generally politically conservative videos based on its
    political viewpoint rather than the content falling into any restricted
    categories. (Prager, at p. 1033.) The court determined the plaintiffs were
    challenging the defendants’ editorial decisions regarding restricting,
    restraining, and censoring content — all traditional publication decisions to
    which section 230 immunity attached. (Prager, at p. 1033.) There were no
    allegations, as here, that the defendant created a system that actively shaped
    the audience based on protected characteristics.
    Facebook’s Lookalike Audience tool and ad-delivery algorithm
    underscore its role as a content developer. According to the complaint,
    Facebook uses its internal data and analysis to determine what specific
    people will receive ads. The algorithm relies heavily on age and gender to
    determine which users will actually receive any given ad. This occurs even if
    an advertiser did not expressly exclude certain genders or older people. The
    algorithm then sends or excludes users from viewing ads based on protected
    characteristics such as age and gender. Because the algorithm ascertains
    data about a user and then targets ads based on the users’ characteristics,
    the algorithm renders Facebook more akin to a content developer. (Vargas,
    supra, 2023 U.S.App. Lexis 15796, p. *8.) Facebook is not entitled to section
    230 immunity for the claims here.
    Disputing this conclusion, Facebook argues its ad tools are neutral
    because third parties, not Facebook, create the allegedly illegal content.
    True, providing neutral tools to users to make illegal or unlawful searches
    does not constitute “ ‘development’ ” for immunity purposes. (Roommates,
    supra, 521 F.3d at p. 1169.) But the system must do “ ‘absolutely nothing to
    enhance’ ” the unlawful message at issue “beyond the words offered by the
    user.” (Kimzey v. Yelp! Inc., supra, 836 F.3d at p. 1270.) For example,
    “a housing website that allows users to specify whether they will or will not
    receive emails by means of user-defined criteria might help some users
    exclude email from other users of a particular race or sex.” (Roommates, at
    p. 1169.) “However, that website would be immune, so long as it does not
    require the use of discriminatory criteria.” (Ibid., italics added.) Here, Liapes
    alleged Facebook “does not merely provide a framework that could be utilized
    for proper or improper purposes.” (Roommates, at p. 1172.) Rather,
    Facebook, after requiring users to disclose protected characteristics of age
    and gender, relied on “unlawful criteria” and developed an ad targeting and
    delivery system “directly related to the alleged illegality” — a system that
    makes it more difficult for individuals with certain protected characteristics
    to find or access insurance ads on Facebook. (Id. at pp. 1167, 1172; compare
with Carafano v. Metrosplash.com, Inc. (9th Cir. 2003) 339 F.3d 1119, 1125
    [website operator was not involved with user’s decision to enter a fake profile
    in a dating service, the illegal activity at issue]; Dyroff v. Ultimate Software
    Group, Inc., supra, 934 F.3d at p. 1099 [website entitled to § 230 immunity
    from claims it permitted trafficking illegal narcotics where recommendation
    and notification functions were based off of information users provided in
    blank text boxes rather than a requirement that users disclose certain
    characteristics].) That third-party advertisers are the content providers does
    not preclude Facebook “from also being an information content provider by
    helping ‘develop’ at least ‘in part’ the information” at issue here, contrary to
    Facebook’s assertions. (Roommates, at p. 1165 [“the party responsible for
    putting information online may be subject to liability, even if the information
    originated with a user”].)
    DISPOSITION
    We conclude, liberally construing the complaint and drawing all
    reasonable inferences in favor of its claims, Liapes alleged facts sufficient to
    state a cause of action. The judgment is reversed. Liapes is entitled to her
    costs on appeal.
    _________________________
    Rodríguez, J.
    WE CONCUR:
    _________________________
    Tucher, P. J.
    _________________________
    Fujisaki, J.
    A164880
    Trial Court: San Mateo County Superior Court
    Trial Judge: Hon. V. Raymond Swope
    Counsel:
    Gupta Wessler, Jennifer D. Bennett, Linnet Davis-Stermitz, Peter Romer-
    Friedman, Matthew W.H. Wessler; Law Offices of William Most, William
    Brock Most; Aqua Terra Aeris Law Group, Jason R. Flanders; Outten &
    Golden, Jahan C. Sagafi, Adam T. Klein, Pooja Shethji; Peter Romer-
    Friedman Law and Peter Romer-Friedman for Plaintiffs and Appellants.
    David Brody, Jon Greenbaum, Sanaa Ansari; Amanda Goad; Olga Akselrod,
    Linda S. Morris; and Jacob Snow for Lawyers’ Committee for Civil Rights
    Under Law, ACLU Foundation of Southern California, ACLU Foundation,
    ACLU Foundation of Northern California and Upturn as Amici Curiae on
    behalf of Plaintiffs and Appellants.
    Gibson, Dunn & Crutcher, Rosemarie T. Ring, Ryan Azad, Theodore J.
    Boutrous, Jr., Bradley J. Hamburger and Matt Aidan Getz for Defendant and
    Respondent.
    
