
Sentiment Analysis Exposed

Updated: Jun 1, 2022

I made it with Max last night! Mary texts her BFF Shelly.


OMG! Welcome to womanhood!! How was it/he? Shelly texts back.


And right about now, Mary's mom gets a 'notification' on her cell phone that her daughter is texting sexual references; at mom's request, the app then displays Mary's texts with Shelly.


Mom spends the rest of the day at work fuming, rehearsing the conversation she'll have with her daughter that evening when they're home together. Mom doesn't like Max. Never did, and she'd told Mary not to see him.



Sentiment Analysis (SA) is the latest hip, slick, and trendy software in the Valley. In a nutshell, it classifies text messages, tweets, blogs, Facebook posts and other social media updates into categories such as 'positive,' 'angry,' 'happy,' 'sad,' 'drug references,' 'sexual references,' etc. This information is purchased by businesses or political campaigns to gauge public sentiment about everything from Joe Biden to your business. SA has been going on for quite some time, but only recently has it filtered down to the individual level, and this is where it becomes dangerous.
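
For readers curious what that classification looks like under the hood, here is a minimal sketch in Python using the open-source VADER analyzer that ships with NLTK. The example messages and the labeling thresholds are mine, chosen for illustration; commercial SA products use their own proprietary models, not this one.

```python
# Minimal sketch: score a few text messages with NLTK's VADER sentiment analyzer.
# Requires: pip install nltk, plus a one-time download of the VADER lexicon.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time download of the scoring lexicon

analyzer = SentimentIntensityAnalyzer()

# Hypothetical messages -- invented for this example, not from any real dataset.
messages = [
    "I made it with Max last night!",
    "OMG! Welcome to womanhood!! How was it/he?",
    "Ugh, I am so angry at my boss today.",
]

for text in messages:
    scores = analyzer.polarity_scores(text)  # returns neg/neu/pos/compound scores
    compound = scores["compound"]
    # Conventional thresholds: above +0.05 positive, below -0.05 negative.
    label = "positive" if compound > 0.05 else "negative" if compound < -0.05 else "neutral"
    print(f"{label:8s} {compound:+.2f}  {text}")
```

Notice what this kind of scorer actually does: it adds up the weights of individual words. It has no idea what Mary means by "made it"; it only reports that the message reads as upbeat. Everything beyond that comes from whatever extra rules and categories the vendor layers on top.
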


Along with analyzing public sentiment, you can now analyze your employee's, child's, or spouse's text messages, whether they know it or not, as long as the phone bill is in your name and you pay the monthly SA app fee. You are instantly notified through your computer or cell phone if your child is texting about drugs or sex, chatting with someone you don't like, bullying, or getting bullied.


You are notified if your employee is texting his wife instead of attending to your customers. You can assess your husband's mood before seeing him that evening from the SA report on the aggression level of his texts for the day, generated by Natural Language Processing (NLP) software that purports to 'understand' the contextual meaning of our language.


Smartphones now have built-in software that tracks all activity, including motion, i.e. whether you're sitting, walking, driving, etc. SA services not only alert their customers to "inappropriate behavior," but also let them remotely control all the phones on their plan. At work, or on the go, you can now monitor and disrupt behavior 24/7. Don't like the boy your daughter does? Block his number. Don't care for a site your employee visits? Block access.


SA service companies are popping up everywhere selling all kinds of crap:

'Track your employees for greater staff commitment and productivity!'

'Tracking your kid is for the child's safety.' In other words, they're marketing that their SaaS will not only see what your teen is texting, but that their SA classifiers will tell you how your teen is feeling, and why. In fact, with their NLP tracking software, they'll even help you get to know your kid.


Hmm... While being able to shut your kid's phone down when they're texting while driving is unarguably beneficial, using SA software to spy on your kids, and then relying on it to assess their behavior, is irresponsible at best.


If your child is texting while driving, that irresponsible behavior is most assuredly reflected in other aspects of their life. Invest the [life]time it'll take you to teach your kids right from wrong, and you'll know, or at least trust, that what they text is within an appropriate range. Talk to your kids and you'll learn what's important to them and why, and you may just be wrong about who is good and bad for them. Get to know your employees, and what your direct reports need from you to inspire them to excel. If performance is optimal, what the hell difference does extraneous phone or online usage make?


Part of the marketing blitz with SA services is that they save the customer time by pulling only 'relevant' information from the average 60 texts a day sent by teens, and the ever-increasing number sent by adults. Anyone who knows code knows a computer is blank until someone programs it, tells it to DO something. With sentiment software, 'notifications' of 'inappropriate behavior' become a construct of the SA programmers and the sample groups that define the classifiers. Clients can input keywords and other preferences, such as the number of alerts per day, but the SA program has its own sets of keywords and, more troubling, NLP algorithms that analyze patterns of word usage to determine what should be flagged.
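
To make that concrete, here is a hypothetical sketch of keyword-based flagging in Python. The category names, keyword lists, and client preferences are invented for illustration and do not come from any real SA product; the point is simply that the vendor's built-in lists do the flagging, no matter what the paying customer adds.

```python
# Hypothetical sketch of keyword-based flagging -- categories, keywords, and
# preferences below are invented for illustration, not taken from any real product.
from dataclasses import dataclass, field

VENDOR_KEYWORDS = {               # the vendor's built-in lists, fixed by its programmers
    "sexual": {"made it", "hooking up"},
    "drugs": {"weed", "pills"},
}

@dataclass
class ClientPrefs:                # what the paying customer is allowed to tweak
    extra_keywords: dict = field(default_factory=dict)
    max_alerts_per_day: int = 5

def flag_message(text: str, prefs: ClientPrefs) -> list[str]:
    """Return the categories a single message gets flagged under."""
    lowered = text.lower()
    flags = []
    # The client's lists are merged on top of the vendor's, but the vendor's never go away.
    for category, words in {**VENDOR_KEYWORDS, **prefs.extra_keywords}.items():
        if any(word in lowered for word in words):
            flags.append(category)
    return flags

# Usage: mom adds a keyword for the boy she dislikes; the vendor's lists still apply.
prefs = ClientPrefs(extra_keywords={"boys_mom_dislikes": {"max"}})
print(flag_message("I made it with Max last night!", prefs))
# -> ['sexual', 'boys_mom_dislikes'] -- flagged, with no idea what 'made it' means here
```

A real product replaces the crude substring check with statistical NLP models, but the structure is the same: someone, somewhere, decided which words and patterns count as 'inappropriate,' and the notification on mom's phone is the output of that decision, not of understanding.
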


While SA programmers claim up to 93% accuracy in the best scenarios, tests have been limited to small sample groups and rely on a handful of users to rate the success of the assessment. It is beyond arrogance to assume we can quantify human emotions. Using SA software, Mom may find out that Mary "made it" with Max, or not, since Mary may be bragging about something that never happened, or "made it" could mean holding hands, or completing a school project with him. The information the SA provides does not tell Mom why her daughter said what she did to her BFF, Shelly. Or even if it was the truth.


Your partner texting something sexual to someone other than you? The real-time alert on your iPhone may reveal infidelity, but it will not illuminate the communication gap between you two that's clearly being ignored.


And watch out for the 7% of the time SA software is flat-out wrong. Instead of investing the time to get to know her daughter and sharing her child's life (so mom is the first one Mary comes to with big life events), mom will come home tonight with self-righteous indignation. She'll be raving angry at her daughter, shutting down communication, and putting more distance between them over an assumption made by SA software that is simply... stupid. The SaaS is unable to process complex human emotions and behaviors, like the lies we tell ourselves, and others, virtually daily.

Now, more than ever, with our world becoming smaller as different cultures blend, it is essential we communicate, talk to each other, and look for common ground: the feelings all humans share, and should be sharing with each other. Our frailties unite us. Regardless of where you are from, who you sleep with, or what you believe, we all FEEL THE SAME THINGS: happy, sad, glad, mad, etc. These are core human feelings that computers do not feel, and therefore SA software will always be assuming (ASSUMING often makes an "ASS" of "U" and "Me"), but never really understanding, our behavior.


We are now privy to a spectacular array of communication tools with the potential to connect us all for greater understanding and tolerance. But SA software is counterproductive to open dialog at best, and fundamentally corrosive at worst. It is sure to sow discord and distrust, much the way the internet is now viewed as isolating us and dividing us into segmented groups, when, at the net's inception, it was supposed to unite the planet. Even more destructive, SA software in SaaS products, whether used to spy on our kids or used in marketing to influence elections by inciting ignorant, angry people to elect the second coming of Hitler, has put, and will continue to put, even greater distance between us.
