Whether it’s bias,
disinformation, a joke, or just a misunderstanding, there are a lot of
falsehoods on the internet. Sifting through the crowded information ecosystem to
find the truth is not always easy. And in an election season, it’s even harder!

According to tech expert Alex
Fink, if we can use AI to create things like deepfakes, we can (and should)
also use AI to spot and stop misinformation.

Alex Fink is a technical
executive and Silicon Valley Expat who founded Otherweb, a free platform that
anyone can use to analyze online items for validity.

Otherweb (which will launch
as an app for Android and iOS this fall) uses a set of AI models that
analyze text along multiple dimensions – informativity, subjectivity,
formality, offensiveness, hatefulness, external references, source diversity,
clickbait headlines, use of known propaganda techniques, and more.

In an interview or article, Alex
Fink can share VALUABLE INFO & ADVICE about the information ecosystem and
how to mine it for truth.

His INSIGHTS & PRACTICAL TIPS
are applicable to all people (including techies, digital natives & novices,
as well as media creators & consumers!).

MORE ABOUT ALEX FINK:

Alex Fink is a Tech Executive,
Silicon Valley Expat, and the Founder and CEO of the Otherweb, a free platform
that allows users to read news and commentary, listen to podcasts and search
the web without paywalls, clickbait, ads, autoplaying videos, affiliate links,
or any other junk.

Otherweb looks to help media
creators align their incentives with those of their readers. It uses a
set of AI models (called “ValuRank”) that analyze text along multiple
dimensions – informativity, subjectivity, formality, offensiveness,
hatefulness, external references, source diversity, clickbait headlines, use of
known propaganda techniques, and more. Readers can then use these scores to
customize their feeds, which results in higher-quality articles (that match the
users’ stated preferences) being viewed by more people.
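The score-then-filter idea described above can be sketched in a few lines of Python. The dimension names, score ranges, and threshold logic below are illustrative assumptions for the sake of the example, not Otherweb’s actual ValuRank models or API:

```python
# Hypothetical sketch of score-based feed filtering, in the spirit of the
# ValuRank description above. All dimension names and thresholds are
# invented for illustration.

def passes_preferences(scores, preferences):
    """Return True if an article's scores satisfy every user threshold.

    scores: dict mapping dimension -> value in [0, 1]
            (higher = more of that trait, e.g. more clickbait-y)
    preferences: dict mapping dimension -> (min_allowed, max_allowed)
    """
    for dimension, (low, high) in preferences.items():
        value = scores.get(dimension, 0.0)  # missing dimensions default to 0
        if not (low <= value <= high):
            return False
    return True

articles = [
    {"title": "Budget passes after late vote",
     "scores": {"informativity": 0.9, "clickbait": 0.1}},
    {"title": "You won't BELIEVE what happened",
     "scores": {"informativity": 0.3, "clickbait": 0.95}},
]

# A user who wants informative, low-clickbait items:
prefs = {"informativity": (0.5, 1.0), "clickbait": (0.0, 0.3)}
feed = [a["title"] for a in articles
        if passes_preferences(a["scores"], prefs)]
# Only the first article survives the filter.
```

The point of the sketch is the separation of concerns: the models assign per-dimension scores once, and each reader’s feed is then just a cheap filter over those scores.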
