Rise of big data – Increases in storage and the power of computing
Big data has become a hot topic in recent years (a Google search for “big data” yields 71 million results). The phrase refers to masses of data that traditional software and tools are ill-equipped to process. Much of these data come in unstructured form – things like text, photos and videos that aren’t neatly arranged in the rows and columns of traditional relational databases.
Broad use of the internet, low-cost sensors and the rise of social media have combined to democratize content generation. At the same time, the costs of computing and storing these data have fallen dramatically: hard drive storage that cost well over $100k per gigabyte in the 1980s now goes for less than 10¢ per GB. The result is that we are in an unprecedented era of data generation, with far more data created in the last decade than in all the rest of human history.
Unstructured data formats make use difficult
Because so much of these data are unstructured, the majority goes un- or under-utilized, leaving vast volumes of hidden insights for individuals, businesses and governments yet to be revealed. Numerous tech companies have identified the huge opportunity at hand, vying to be the go-to resource for those interested in unlocking the insights that lurk in data warehouses.
Natural language processing using IBM Watson® technology unlocks data, making it actionable
IBM’s Watson has emerged as a frontrunner in this space, with early publicity coming from its Jeopardy! win over past champions Ken Jennings and Brad Rutter. The computer used its natural language processing algorithms to rapidly understand the questions being asked and reason through potential answers before delivering a response. It’s likely that Watson would perform far better if asked to complete a similar task today; machine learning techniques mean that humans are able to use an algorithm’s past mistakes to teach the computer and thus improve future performance.
As a Watson Ecosystem partner, Housing Tides uses Watson technology to filter and understand the sentiment of over 500 housing news articles per month while identifying relevant forecasts offered by industry professionals. We believe that – by adding structure to the inherently unstructured form of news articles – there’s actionable information to be had in these textual data.
Hedge funds and traders are using sentiment analysis of news and social media platforms when making investment decisions
Hedge funds and money managers tend to agree. More and more, they’re leveraging these new technologies in an effort to gain a competitive advantage over their peers, using entity recognition and sentiment analysis tools to assess news content and Twitter posts in the search for equity price signals. Time is of the essence when trading stocks, and computers can read and derive meaning from thousands of articles or tweets in the time a human would take to read this blog post. A rapid rise or fall in sentiment toward a given company gives traders advance information that can inform long or short positions before retail investors catch on to the latest developments.
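To make the idea concrete, here is a minimal, illustrative sketch of lexicon-based sentiment scoring. Production systems (including Watson’s natural language tools) use far richer models; the word lists, headlines and crude whitespace tokenization below are hypothetical simplifications for illustration only.

```python
# Toy lexicon-based sentiment scorer. The word sets are hypothetical;
# real sentiment models learn far more nuanced representations.
POSITIVE = {"surge", "gain", "growth", "record", "strong"}
NEGATIVE = {"slump", "loss", "decline", "default", "weak"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: +1 if all matched words are positive,
    -1 if all are negative, 0 if no lexicon words are found."""
    words = text.lower().split()  # crude tokenization for illustration
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

# Hypothetical headlines scored the way a trading signal might be built.
headlines = [
    "Home prices surge to record high on strong demand",
    "Builder confidence shows weak decline amid slump fears",
]
for h in headlines:
    print(f"{sentiment_score(h):+.2f}  {h}")
```

A real pipeline would aggregate scores like these across thousands of articles per company per day, trading on sharp shifts in the rolling average rather than on any single headline.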
Nowcasting using news topics
Similarly, in 2016 economist Leif Anders Thorsrud published an article titled “Nowcasting using news topics: Big Data versus big bank,” in which he demonstrated that, by parsing textual data from a major newspaper to identify the frequency of various news topics, he could improve upon the gross domestic product (GDP) forecasts offered by central banks. He writes that the “agents in the economy use a plethora of high frequency information, including media, to guide their actions and thereby shape aggregate economic fluctuations.” Keeping in mind that economic outcomes are the aggregation of decisions made by individuals with limited knowledge, and that much of this knowledge is obtained through major media channels, we shouldn’t underestimate the effects that media coverage has on the housing market.
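The core idea behind this kind of nowcasting can be sketched in a few lines: tally how often articles touch on each topic, producing a frequency series that could feed an economic model. This is a heavily simplified illustration, not Dr. Thorsrud’s method – he uses statistical topic modeling rather than the hypothetical keyword lists shown here.

```python
# Illustrative sketch of news-topic frequency counting. The topic
# keyword lists and sample articles are hypothetical; real nowcasting
# work infers topics statistically (e.g. with topic models).
from collections import Counter

TOPICS = {
    "housing": {"mortgage", "housing", "construction"},
    "labor": {"jobs", "unemployment", "wages"},
}

def topic_counts(articles):
    """Count how many articles mention at least one keyword per topic."""
    counts = Counter()
    for text in articles:
        words = set(text.lower().split())
        for topic, keywords in TOPICS.items():
            if words & keywords:  # any keyword present in this article
                counts[topic] += 1
    return counts

articles = [
    "Construction spending rises as mortgage rates dip",
    "Unemployment falls and wages tick up",
    "New housing starts slow in the Midwest",
]
print(topic_counts(articles))
```

Computed monthly, counts like these form a time series of media attention per topic – the kind of high-frequency signal that can supplement slower-moving official statistics.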
The Housing Tides news analysis was created to help improve housing market investment decisions
We argue that an approach combining these techniques is the most appropriate because it would deliver the highest-quality actionable information to industry stakeholders. Specifically, we believe that timely knowledge of the frequency, content and sentiment of housing news articles can meaningfully improve decision-making when considered jointly with analysis of relevant economic fundamentals, such as those included in the Housing Tides Index™.
Reading every piece of housing news in a given month is like drinking from a fire hose. For busy professionals, the time required to read and digest hundreds of articles each month makes it impractical to grasp the consensus view of the industry. By using IBM Watson cognitive computing technology, we can amplify signal and reduce noise, supporting the human knack for pattern recognition. With these tools it’s never been easier to quickly distill volumes of unstructured textual data into meaning and put that meaning in context – context that’s essential for good strategy and decision-making.