Data, over time, creates trails from which narratives emerge. When those trails tell a story that begs to be told, we're there to report it. This, of course, can be a dicey business, especially when you don't show your work or make clear where your numbers came from. And in a world where it's important to question facts, data becomes a storytelling goldmine.

We live in that goldmine here at Thinknum Media.

Data is the matrix on which we all live. When you see it, digest it, and use it to tell a story or support an argument, you're doing so armed with facts. Speaking of The Matrix, after Neo was given a data download of martial arts in the classic sci-fi movie, he simply proclaimed, "I know kung fu."

Let's say we teach data kung fu here at Thinknum Media. We scrape publicly-available data sources, look for trends, and tell the stories. But when you obfuscate data and use it for your own designs beyond sharing some facts, you undermine trust. That's why, in every story we report at Thinknum Media, we invite readers behind the curtain, to see the data first-hand. Our process is simple: we start with numbers rather than suppositions.
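To make that process concrete, here is a minimal sketch of the kind of pipeline that sentence describes, assuming a hypothetical public feed and field names (none of this is a real Thinknum source or API): fetch the numbers, bucket them over time, and only then ask what story they tell.

```python
# A rough sketch of the workflow described above: pull a publicly
# available dataset, bucket it over time, and see whether a trend
# emerges. The URL and field names below are hypothetical placeholders
# for illustration, not a real Thinknum source or schema.

from collections import Counter
from datetime import datetime

import requests

PUBLIC_FEED_URL = "https://example.com/public-job-listings.json"  # hypothetical


def fetch_records(url):
    """Download a publicly available dataset as a list of records."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()


def weekly_counts(records):
    """Bucket records by ISO week so week-over-week changes stand out."""
    counts = Counter()
    for record in records:
        # Assumes each record carries an ISO-8601 "posted_at" timestamp.
        posted = datetime.fromisoformat(record["posted_at"])
        counts[posted.strftime("%G-W%V")] += 1
    return counts


if __name__ == "__main__":
    records = fetch_records(PUBLIC_FEED_URL)
    for week, count in sorted(weekly_counts(records).items()):
        print(week, count)
```

Nothing exotic - a fetch, a count, a sort - but every figure in the output traces straight back to a source anyone can check.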

This weekend saw heated discourse on privacy and user data as Cambridge Analytica was discovered to have acquired Facebook user data and put it to work for those who desired to shape political opinions on the world's largest social media network. How'd they do it? With a personality quiz app called thisisyourdigitallife that not only asked Facebook users to log in with their credentials, but also collected very personal psychometrics about them. If you've used Facebook in the past few years, you've probably been invited to take a personality quiz - it's a thing.

The thisisyourdigitallife Facebook app went viral, and the person behind it suddenly had valuable troves of personality profiles in his hands. He said he was collecting the data for academic research, but what he did with it was entirely different - and ultimately disturbing.

According to the Wall Street Journal, Cambridge Analytica bought the user data from Aleksandr Kogan, a Russian-American psychology researcher at the University of Cambridge. Mr. Kogan, funded by Cambridge Analytica to the tune of more than $800,000, created a personality quiz app for Facebook. This app, like any Facebook app, asked users to opt in to sharing their personal data. That, combined with those users' answers to the quiz's personality questions, allowed the app's designer to collect the data and shape it into user matrices built on psychographic traits like openness, conscientiousness, extroversion, and more.

Some of the many personality quizzes on Facebook

And who paid for it? Cambridge Analytica was bankrolled by Robert Mercer, a wealthy Republican donor working with political advisor Stephen K. Bannon. Cambridge Analytica promised that, with the data, Mercer and his clients could target particular American voter personality types and ultimately influence their behavior during elections.

As to whether or not it worked, that's a question for another time.

Targeting messages with user data is nothing new. "Programmatic advertising" has turned the marketing industry into a massive tech play, with firehoses of money streaming into big players like Google, LinkedIn, and, yes, Facebook. The better the user data, the more accurately marketers can target people - and the more successful the platform becomes.

It's argued that this is a good thing, because users only see messaging that's relevant to them. Gone are the days of 30-second spots that mean nothing to you: the future is commercials that seem to know you better than you know yourself, for better or worse.

It would be fair to say that in many cases, user data is the currency that drives technology innovation - whether in finance, media, marketing, or, in the case of Cambridge Analytica, politics. Virtually all technology companies - at least the successful ones - act on the data they mine from users.

But what's different about the data that Cambridge Analytica used is how it was acquired and what its reapers did with it.

For our purposes, let's look at the facts - or the data, if you will: By handing the data over to Cambridge Analytica, Facebook argues, Mr. Kogan violated the platform's privacy rules. That's because, of the millions of profiles he handed over, only about 270,000 belonged to users who had consented to having their data collected - the rest were pulled in through those users' friends.

We occasionally do stories about Facebook activity data here at Thinknum Media. In those, we look only at data that Facebook makes public - none of it links back to individuals. But for our purposes, that data is fascinating enough for us to tell some great stories: for example, which Facebook apps are being used most, what people are talking about, and where users are checking in. That lets us look for trends in business - real-world and virtual - and tell those stories.

Stories like these:

The one thing all those stories have in common: They're not targeted at individuals, just trends.
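For the curious, here is a minimal sketch of what "trends, not individuals" looks like in practice, using made-up check-in records and field names (not a real Facebook or Thinknum schema): anything tied to an individual user is ignored, and only an aggregate ranking survives.

```python
# Aggregate-only analysis in miniature: hypothetical public check-in
# records go in, a per-venue ranking comes out, and nothing tied to an
# individual user is kept. Field names are assumptions for illustration.

from collections import Counter


def venue_ranking(checkins):
    """Count check-ins per venue, ignoring who checked in."""
    counts = Counter()
    for checkin in checkins:
        # Keep only the aggregate dimension (the venue), never the user.
        counts[checkin["venue"]] += 1
    return counts.most_common()


if __name__ == "__main__":
    sample = [
        {"venue": "Joe's Coffee"},
        {"venue": "Joe's Coffee"},
        {"venue": "City Gym"},
    ]
    for venue, count in venue_ranking(sample):
        print(f"{count:3d}  {venue}")
```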

In fact, anyone can verify the data we report at Thinknum Media, and we like it that way. It creates an atmosphere of openness and, most importantly, curiosity. If you as a reader want to look deeper, we invite you to do so. We have nothing to hide.

Instead, by showing you the data connections we're making, we're hoping you'll make some of your own. It's sort of like kung fu, if you will.
