“Computer power is growing significantly, algorithms are becoming more sophisticated, and, perhaps most important of all, the world is generating vast quantities of fuel that powers AI—data.” [8. McKinsey]
Operationally, this is a huge task for buy-side organizations looking to invest in talent and technology, and to sift through what’s available today versus tomorrow. For discretionary managers, new data can mean significant capital spent if it isn’t carefully incorporated.
But new data types also offer firms the opportunity to take in signals that might not traditionally have sat in their wheelhouse. Regardless of what type of shop you’re running (long equities, multi-strat, distressed assets, futures) there’s potential value in different sources and flavors. Finding sources that consistently outperform a benchmark is the difference between sunk information costs and valuable information arbitrage.
Today, there’s a lot of existing source material (unstructured and structured) out there, including the current flavors in the Nasdaq Analytics Hub[9. The Nasdaq Analytics Hub launched May 2017—it derives daily buy and sell signals, and longer-term investment signals from a growing number of structured and unstructured data sources. The Hub leverages machine intelligence to derive additional signals proprietary to Nasdaq for each data source to discover what may or may not be knowable. Back-tested results of the signals in the Hub demonstrate benchmark-outperformance over multi-year periods.]:
• Twitter sentiment: digested terabytes of data into wisdom-of-the-crowd signals—outperforming the Russell 3000 by 232% over a 4-year period
• Central banking communiqué: crunched speeches and documents for reserves around the world—outperforming the S&P 500 by 550% over a 14-year period
• Technical analysis: sector rotation and benchmarking from daily stock movements—outperforming the S&P 500 by 1611% over a 14-year period
• Premium Alpha Shorts: captured proprietary sentiment among retail traders—outperforming the HFRXEH by 336% over a 9-year period
• Multi-Expert Factors (Long/Short): technical and fundamental data working in concert for defined holding periods—outperforming the Russell 1000 by 239% over a 14-year period
• Multi-Expert Factors (Long-Only): technical and fundamental data working in concert for defined holding periods—outperforming the Russell 1000 by 1,141% over a 14-year period
• Retail Trader Sentiment (Long/Short): aggregated retail trader predictions for defined holding periods—outperforming the HFRXEH by 410% over a 9-year period
• Corporate Filings: surfaced information buried in key disclosures and filings in an efficient and timely manner, with ~70% of closed signals and ~55% of open signals outperforming the S&P 500 within fixed criteria
But beyond sourcing, turning these sources into signals on your own can be an expensive process. Once a dataset is in the door, firms need the internal technical muscle to:
- Rank the data; and
- Incorporate it.
All of which takes time, dollars, and bench strength to make the data meaningful.
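The ranking step above often reduces to something concrete: back-testing each candidate source against a benchmark and ordering by excess return. A minimal sketch, with entirely hypothetical source names and made-up quarterly returns:

```python
# Illustrative sketch: ranking candidate signal sources by back-tested
# cumulative return in excess of a benchmark. All data below is invented.

def cumulative_return(period_returns):
    """Compound a sequence of simple period returns into a total return."""
    total = 1.0
    for r in period_returns:
        total *= 1.0 + r
    return total - 1.0

def rank_sources(source_returns, benchmark_returns):
    """Rank data sources by cumulative return in excess of the benchmark,
    best first."""
    bench = cumulative_return(benchmark_returns)
    excess = {name: cumulative_return(r) - bench
              for name, r in source_returns.items()}
    return sorted(excess.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical quarterly returns for two candidate sources and a benchmark.
sources = {
    "twitter_sentiment": [0.04, 0.02, -0.01, 0.05],
    "filings_signal":    [0.01, 0.03,  0.02, 0.01],
}
benchmark = [0.02, 0.01, 0.01, 0.02]

for name, excess in rank_sources(sources, benchmark):
    print(f"{name}: {excess:+.2%} vs. benchmark")
```

Even this toy version hints at the real cost: production-grade ranking also needs survivorship-bias controls, transaction-cost modeling, and out-of-sample testing, which is where the time, dollars, and bench strength go.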
Man with Machine
“Deep Blue plays very good chess—so what? Does that tell you something about how we play chess? No. Does it tell you about how Kasparov envisions, understands a chessboard?” –Douglas Hofstadter[10. The Atlantic]
Humans and machines both have their limitations. And while the human brain has incredible processing power, it simply cannot ingest everything and keep up as worldwide data doubles year over year.[11. CNBC] Cognitive overload is real. And the machines are here to help.
Watson. Einstein. Alexa. Siri.
This is a new cast of characters driving valuations and increasing productivity; connecting the dots in your home and taking transcontinental infrastructure to the next level. They’re pushing the limits of what’s knowable, and making the knowable actionable.
And for the buy side, applying similar artificial intelligence techniques (whether machine or deep learning) to cull through vast new datasets will drive alpha beyond the traditional two-and-twenty business model, extracting intelligence through a combination of unique data sources, algorithmic iteration, and human talent.
Taking away the risk of cognitive overload is one thing, but to truly achieve a more informed, better-armed, operationally efficient pursuit of alpha, firms must strike a balance between bodies and bytes, through the right data partner. To remove the barriers of sourcing, conversion, and continuous discovery, it’s key to look for a provider that can:
- Vet the dataset partners through rigorous research and standards.
- Validate the datasets themselves, through back-testing and machine learning to find where the signal is—and provide generic case studies on how a set might be valuable.
- Manage a growing roster of dataset partners to stay ahead of what’s knowable—and what’s meaningful—for firms of many shapes and sizes, with the ease of a single relationship and SLA.
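The validation step in the checklist above typically comes down to benchmark-relative metrics, such as the closed-signal hit rates quoted earlier for corporate filings. A minimal sketch of that check, using invented numbers:

```python
# Illustrative sketch: the share of closed signals whose holding-period
# return beat the benchmark over the same window. All data is invented.

def hit_rate(signal_returns, benchmark_returns):
    """Fraction of signals that outperformed their matched benchmark
    return (paired by holding window)."""
    if len(signal_returns) != len(benchmark_returns):
        raise ValueError("each signal needs a matched benchmark return")
    wins = sum(1 for s, b in zip(signal_returns, benchmark_returns) if s > b)
    return wins / len(signal_returns)

# Hypothetical per-signal holding-period returns and matched benchmark returns.
signals    = [0.08, -0.02, 0.05, 0.01, 0.03]
benchmarks = [0.02,  0.01, 0.02, 0.02, 0.01]

print(f"hit rate: {hit_rate(signals, benchmarks):.0%}")
```

A provider doing this at scale would pair the hit rate with significance testing and out-of-sample validation; the point of the sketch is only that "where the signal is" is a measurable, benchmark-relative question.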
In 1960, a headline in the New York Times read “SPACEMAN IS SEEN AS MAN-MACHINE; Scientists Depict the Human Astronaut as Component of a Cyborg System.”
While we aren’t there yet, technology will inevitably continue to expand our capabilities, as both individuals and groups.
But for groups in any vertical, what will be the cost?
The price of discovery, vetting, validation, and implementation will challenge firms, especially those under pressure from shifting outside forces like economic conditions and client expectations. And the price of understanding what’s knowable, and how to make money from it, will become the defining feature of success for the buy side.
To learn more about new data sources, information arbitrage, and technology working at the service of human talent, visit business.nasdaq.com/hub or contact email@example.com.
To trial the Analytics Hub or become a dataset provider—contact DataSales@Nasdaq.com or visit business.nasdaq.com/hub.