• Tetris reveals how people respond to unfair AI

    From ScienceDaily@1:317/3 to All on Mon May 15 22:30:18 2023
    Tetris reveals how people respond to unfair AI

    Date:
    May 15, 2023
    Source:
    Cornell University
    Summary:
    An experiment in which two people play a modified version of
    Tetris revealed that players who get fewer turns perceived the
    other player as less likable, regardless of whether a person or
    an algorithm allocated the turns.



    ==========================================================================
    FULL STORY
    ==========================================================================
    A Cornell University-led experiment in which two people play a modified
    version of Tetris revealed that players who get fewer turns perceived
    the other player as less likable, regardless of whether a person or an
    algorithm allocated the turns.

    Most studies on algorithmic fairness focus on the algorithm or the
    decision itself, but researchers sought to explore the relationships
    among the people affected by the decisions.

    "We are starting to see a lot of situations in which AI makes decisions
    on how resources should be distributed among people," said Malte Jung, associate professor of information science, whose group conducted the
    study. "We want to understand how that influences the way people perceive
    one another and behave towards each other. We see more and more evidence
    that machines mess with the way we interact with each other." In an
    earlier study, a robot chose which person to give a block to and studied
    the reactions of each individual to the machine's allocation decisions.

    "We noticed that every time the robot seemed to prefer one person,
    the other one got upset," said Jung. "We wanted to study this further,
    because we thought that, as machines making decisions becomes more a part
    of the world -- whether it be a robot or an algorithm -- how does that
    make a person feel?"

    Using open-source software, Houston Claure -- the study's first author
    and a postdoctoral researcher at Yale University -- developed a
    two-player version of Tetris, in which players manipulate falling
    geometric blocks in order to stack them without leaving gaps before the
    blocks pile to the top of the screen.

    Claure's version, Co-Tetris, allows two people (one at a time) to work
    together to complete each round.

    An "allocator" -- either human or AI, which was conveyed to the players -
    - determines which player takes each turn. Jung and Claure devised their experiment so that players would have either 90% of the turns (the "more" condition), 10% ("less") or 50% ("equal").
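
    The story doesn't include the experiment's software, but a minimal
    sketch of such a turn allocator -- assuming, purely for illustration,
    that each falling block is assigned independently according to the
    stated shares, with hypothetical names throughout -- could look like
    this in Python:

        import random

        # Hypothetical sketch, not the study's actual code: assign each
        # falling block to player 1 or player 2 according to player 1's
        # share of turns ("more" = 90%, "equal" = 50%, "less" = 10%).
        CONDITIONS = {"more": 0.9, "equal": 0.5, "less": 0.1}

        def allocate_turns(condition, total_turns, seed=None):
            """Return a list of player numbers (1 or 2), one per block."""
            rng = random.Random(seed)
            p1_share = CONDITIONS[condition]
            return [1 if rng.random() < p1_share else 2
                    for _ in range(total_turns)]

        if __name__ == "__main__":
            turns = allocate_turns("more", total_turns=100, seed=42)
            print("player 1 got", turns.count(1), "of", len(turns), "turns")

    In the study, the key manipulation was whether a person or an AI was
    said to be doing the allocating; the sketch above only captures the
    share of turns each player receives.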

    The researchers found, predictably, that those who received fewer turns
    were acutely aware that their partner got significantly more. But they
    were surprised to find that feelings about it were largely the same
    regardless of whether a human or an AI was doing the allocating.

    The effect of these decisions is what the researchers have termed
    "machine allocation behavior" -- similar to the established phenomenon
    of "resource allocation behavior," the observable behavior people
    exhibit based on allocation decisions. Jung said machine allocation
    behavior is "the concept that there is this unique behavior that
    results from a machine making a decision about how something gets
    allocated."

    The researchers also found that fairness didn't automatically lead to
    better game play and performance. In fact, equal allocation of turns
    led, on average, to a worse score than unequal allocation.

    "If a strong player receives most of the blocks," Claure said, "the team
    is going to do better. And if one person gets 90%, eventually they'll
    get better at it than if two average players split the blocks."

    ==========================================================================
    Story Source:
    Materials provided by Cornell University. Original written by Tom
    Fleischman, courtesy of the Cornell Chronicle. Note: Content may be
    edited for style and length.


    ==========================================================================
    Journal Reference:
    1. Houston Claure, Seyun Kim, René F. Kizilcec, Malte Jung. The social
    consequences of Machine Allocation Behavior: Fairness, interpersonal
    perceptions and performance. Computers in Human Behavior, 2023; 146:
    107628. DOI: 10.1016/j.chb.2022.107628
    ==========================================================================

    Link to news story: https://www.sciencedaily.com/releases/2023/05/230515131943.htm

    --- up 1 year, 11 weeks, 10 hours, 50 minutes
    * Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1:317/3)