Machines understanding us
In the time it has taken to create this page and write this line of text, 18.1 billion messages have been sent, 55,140 photos have been posted on Instagram, 4.5 million videos have been played on YouTube, and Google has processed 4,497,420 searches. These are just a few examples of how data generation is unstoppable.
Every day, at every moment, we are producing data: what time we set the alarm, how many times we snooze it, the first application we open on our phones, who we send our first WhatsApp message to or receive it from, which social network we open, which profile catches our attention the most, how many likes we give, to whom, and even why, which pages we visit, how long it takes us to read an article. And that is only the first ten minutes of any ordinary person's day, so imagine the amount of information we generate over the course of a whole day.
We have generated 90% of the world's documented data in the last five years. This is not surprising: since Intel introduced the first commercial microprocessor (the 4004) in 1971, human beings have constantly sought to improve that capacity, making it faster, more powerful and cheaper to produce. This is, in a nutshell, what Gordon Moore captured with his empirical observation: roughly every two years, computing capacity doubles while production costs keep falling (the popular "18 months" figure comes from a later variant of the prediction). In recent years, however, we have begun to approach the physical limits of silicon, so other mechanisms are being explored, such as quantum computing, which promises exponential gains over today's machines for certain problems, although for now this power remains the exclusive domain of the technology giants because of its price, complexity and ongoing development.
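To make the doubling pattern concrete, here is a minimal back-of-the-envelope sketch of Moore's observation. The starting point (the Intel 4004's roughly 2,300 transistors in 1971) is real; the projection itself is only the idealized curve, not measured data.

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Idealized transistor count assuming a fixed two-year doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Each decade multiplies the count by 2**5 = 32 under this idealized curve.
for year in (1971, 1981, 1991, 2001):
    print(year, round(transistors(year)))
```

Real chips only loosely track this curve, which is exactly why it is called an empirical observation rather than a physical law.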
Data allows us to transform raw material into valuable information, but the tools to explore it go beyond Excel or a relational database. Those are valid and powerful tools; however, if we want to find the real value and go deeper into the data, we must use data science tools such as R, Python or Scala, and think a little further than sums, averages and pivot tables. Here descriptive statistics, probability and linear algebra become more relevant, and broad aggregate samples are no longer enough, because we are in an era where people want and need to feel treated as unique.
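A minimal sketch of what "further than sums and averages" can mean, using only Python's standard library. The sample values are invented for illustration; the point is that the mean alone hides what the median and standard deviation reveal about an outlier.

```python
import statistics as stats

# Invented example: seconds a customer waits for each reply in a chat.
response_times = [1.2, 0.9, 1.4, 5.8, 1.1, 1.3, 0.8, 1.2]

mean = stats.mean(response_times)      # pulled upward by the 5.8 outlier
median = stats.median(response_times)  # robust measure of the typical case
spread = stats.stdev(response_times)   # how dispersed the values are

print(f"mean={mean:.2f}  median={median:.2f}  stdev={spread:.2f}")
```

Here the mean (about 1.71 s) suggests slower service than the median (1.2 s) does; a pivot table showing only averages would miss that distinction.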
Now, to add another ingredient to this cocktail: according to Forbes, 90% of the data generated in the world is unstructured information, such as calls, chats, photographs and videos. This makes analysis even more complex, because while 2 + 2 = 4 in structured data, where each attribute is numerical, in a chat or a call "two plus two" is not necessarily equal to four, because context is required to interpret the claim. This is why natural language processing and natural language understanding (NLP and NLU) were born.
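The gap between structured and unstructured data can be sketched in a few lines. The tiny hand-made lexicon below is a stand-in for what a real NLP model must learn from data; the sketch only shows that words carry no arithmetic meaning until some mapping supplies it.

```python
# Structured data: the numbers compute directly.
assert 2 + 2 == 4

# Unstructured text: meaning must come from a mapping (here, a toy lexicon).
word_values = {"two": 2, "three": 3, "plus": "+", "minus": "-"}

def naive_parse(phrase):
    """Map words to numbers/operators with a fixed lexicon; no real context."""
    tokens = [word_values[w] for w in phrase.lower().split()]
    a, op, b = tokens
    return a + b if op == "+" else a - b

print(naive_parse("two plus two"))  # 4, but only because the lexicon says so
```

A real system cannot rely on a hand-written lexicon: it has to infer meaning, including idiom and sarcasm, from context, which is precisely the problem NLP and NLU address.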
At Ressolve we apply these techniques by training a machine capable of processing large volumes of audio and text. It is programmed with mathematical algorithms that listen to spoken conversations, convert them to text, and assign numerical representations to those texts. It is also trained by people who study the social and linguistic sciences, so that it understands dialects, idioms, sarcasm, cultural particularities and the other uses we give our language.
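The "assign numerical representations to texts" step can be illustrated with a classic bag-of-words count vector. Ressolve's actual models are not public, so this is only a generic sketch of the idea, with an invented toy vocabulary.

```python
from collections import Counter

def bag_of_words(transcript, vocabulary):
    """Turn a transcribed utterance into a vector of word counts."""
    counts = Counter(transcript.lower().split())
    return [counts[word] for word in vocabulary]

# Assumed toy vocabulary; a real system would learn a much larger one.
vocab = ["refund", "thanks", "problem", "cancel"]
vector = bag_of_words("I have a problem and I want to cancel", vocab)
print(vector)  # [0, 0, 1, 1]
```

Once every conversation is a vector like this, the statistical and algebraic machinery described earlier (distances, averages, classifiers) can operate on language at scale.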
With this formula, Ressolve can interpret large volumes of information in a very short time and with objective assessments, giving companies a more detailed knowledge and understanding of their clients, collaborators and users. Today we must think beyond knowing people by their gender, age range and use of products or services; although that information is important, it does not necessarily describe their needs, dreams and interests. We must listen to them, understand them and create a proactive empathy that allows us to deliver and receive value, grow, and improve quality of life.
Ressolve is the artificial intelligence trained to analyze and interpret conversations in any contact channel, be it calls, chats, e-mails or others, in order to extract knowledge that delivers value to your clients, collaborators, users and shareholders.