The Future of Feeling

 
The Future of Feeling: Building Empathy in a Tech-Obsessed World
By: Kaitlin Ugolik Phillips

Here we go. [I moved my conclusion to the top in case you don't feel like reading ALL of my observations]

In summation: it's a super relevant, important, and interesting premise that derailed pretty quickly and explosively into an unhelpful, sketchy, politically charged dissertation with no real solutions. You're probably better off finding a different book on empathy, but if you're interested in VR and the Democratic platform, you'll probably love this.

"Without tone of voice, facial expressions, or any real accountability for what we say, even those of us with the best intentions can have a hard time remembering the humanity of people on the other side of the keyboard."

This was the sentiment I was looking forward to reading about. After reading the book, I find its summary highly misleading about what it actually covers. I absolutely agree that technology has decreased empathy and our ability to communicate well with others. I was hoping this book would explore all the ways that happens, explain the psychology and sociology behind it, and offer practical solutions to address this alarming trend.

However, the author spent at least half of the book detailing a variety of VR endeavors, some not even clearly connected to empathy. The other half was spent largely on AI and robots. It was interesting to think about how those could help build empathy, but frankly, the research isn't very convincing. It was obvious that she wasn't convinced either and was quick to offer disclaimers or pose questions challenging whether the technology can reliably do what it intends.

I found her writing polarizing, somewhat irresponsible, agenda-driven, and lacking in credibility. I was skeptical of much of the research and statistics she presented and wouldn't be surprised if she misrepresented them to skew and support a particular point. She strayed from her proposed thesis and got bogged down in tech exploration instead of helping us become better empathizers.

One big hang-up for me was how politically charged this book ended up being. The way she presented all of her examples was very pointed and came across like: people who disagree with her viewpoint need to work on their empathy so that they will eventually realize that they were wrong- in particular... Republicans/Trump supporters.

Understanding people's feelings is vital to compassion and treating people well. We absolutely need to try to understand other perspectives and recognize how other people are feeling. That is empathy. But Phillips (and I believe a lot of society as a whole) goes too far in treating feelings as the authority on truth and belief. We need to care about people's feelings, but we can't base our morality on feelings- for a number of reasons.

Empathy is still our interpretation and perception of what other people are feeling. Feelings are too complicated to be sure you truly understand someone; feelings are deceptive and often lead us astray, so we can't always trust them; and feelings are highly individualistic. My feelings on any number of issues will almost always oppose someone else's feelings- how can you determine right and wrong based on feelings?

And yet, all of her writing seems to indicate that she equates feelings with morality. One example that stands out to me is her description of the Planned Parenthood VR experience, which places you in the position of a woman entering the clinic, complete with actual audio of protestors outside yelling obscene things. She experienced the VR and wondered whether an experience like this would really change someone's mind if they were truly opposed to Planned Parenthood's activities. I have so many things to say about this, but will attempt to condense.

First of all- the VR was really targeting the protestors' lack of empathy- and I don't think any decent human being would support the things the protestors were yelling; it was rude and evil. Empathizing with women who go into Planned Parenthood would cause you to care about their feelings and want to support them emotionally or financially, to reconsider how you generalize women who go there, and to realize how alone they probably feel- but why would it cause you to suddenly abandon the belief that abortion is the killing of babies? What about empathy for the unborn life that is terminated? Would people who support Planned Parenthood change their beliefs if they experienced a VR depicting how a baby feels as it is being aborted?

So no, feelings cannot be a basis of morality. Empathy is not an authority for truth but rather a tool to better care for and relate to others.

In a book on empathy, the author was actually pretty bad at empathizing with people she doesn't really like. She presented it as empathizing, but it was obviously not genuine. Here she is being 'empathetic' toward tech giants like Bezos and Zuckerberg, who helped create something potentially detrimental to socialization: "it didn't release them from their responsibility, but it released me from expecting something different... if I had their demographics, background, experience, and privilege I might never even think about the impact AI could have on my resume or rap sheet." Apparently empathy is pigeonholing someone based on their demographics and background, passive-aggressively blaming them, and then saying 'Oh, they couldn't help it, so I can't be mad at them. They're not as woke as me.'

There were almost no directives to help us be more aware of how we interact with people using technology or in the presence of technology; instead she focused on tech companies- either blaming them or describing how they are all trying to 'fix' the problem by tweaking existing products or creating new (mostly VR) experiences. But doesn't she see that the cure for the empathy problem lies with individuals? I'm going to go out on a limb and speculate that she doesn't operate from a worldview that believes in people's sin nature and need for a Savior, but I would argue that pursuing what I would call biblical empathy is more effective, because I have to look at others as image-bearers of my Creator, who loves every single other person the same amount He loves me. It humbles you and puts you on the same plane as everyone else. It provides the 'why' for empathy within an already established framework of morality that frees you to focus on caring for people on a foundation of truth.

Another concern with the research she presents is the aspect of manipulation. While she does address the dangers of particular tech options, I don't think it was stressed enough. VR may create empathy for other people, but are the environments real? Are the situations completely accurate? Or are the creators trying to evoke the feeling or belief they want you to have? How could you ever regulate that? She says, "Could it be fake if the emotion it evoked in you was real?" Uhhh... YES! Evoking an intended emotion in someone does not make the means to that end genuine, truthful, or moral.

Also, and obviously, in VR you can't really grasp everything that person is feeling. There are way too many factors, experiences, and motivators that influence our feelings.

Welp.
Turns out I'm bad at condensing.

I'll stop now.
