A post I wrote on the LinkedIn Engineering Blog documenting my team’s work building University Pages for LinkedIn. The post surveys the engineering and data pipelines that go into a modern LinkedIn product like University Pages.
- Data: we built data standardization, auto-follows, similar schools, and notable alumni calculations using Hadoop, Azkaban, and Pig scripts.
- Storage: we modeled school data as a graph and stored it in Espresso, LinkedIn’s distributed document store.
- Search: we built search indices and search services on top of this data using Bobo and Zoie.
- REST: we exposed this data through RESTful APIs built on top of Rest.li.
- Feed: the school status updates feed is powered by our content serving platform (USCP).
- UI: we send this data to the browser as JSON and render it client side with dust.js.
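To make the offline calculations above concrete, here is a hypothetical sketch of one way a "similar schools" score could be computed. The real pipeline ran as Pig scripts on Hadoop scheduled by Azkaban; this toy version just illustrates the idea with Jaccard similarity over alumni-membership sets (the similarity metric and data shape are assumptions, not the production algorithm).

```javascript
// Hypothetical sketch: scoring "similar schools" by alumni overlap.
// Jaccard similarity: |A ∩ B| / |A ∪ B| over two sets of member ids.
function jaccard(a, b) {
  const setA = new Set(a);
  const setB = new Set(b);
  let intersection = 0;
  for (const x of setA) if (setB.has(x)) intersection++;
  const union = setA.size + setB.size - intersection;
  return union === 0 ? 0 : intersection / union;
}

// alumni: map of school name -> array of member ids (toy data).
// Returns the topN most similar schools to the given one.
function similarSchools(alumni, school, topN = 3) {
  return Object.keys(alumni)
    .filter((other) => other !== school)
    .map((other) => ({
      school: other,
      score: jaccard(alumni[school], alumni[other]),
    }))
    .sort((a, b) => b.score - a.score)
    .slice(0, topN);
}
```

In production this kind of pairwise computation would be expressed as a distributed join and aggregation rather than an in-memory loop.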
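The search bullet can be illustrated with a minimal inverted index, the core data structure behind any search service. The actual stack used Bobo and Zoie (Lucene-based, in Java); this toy sketch only shows tokenization and term-to-document postings with AND queries, and all names here are illustrative.

```javascript
// Minimal inverted-index sketch: term -> Set of document ids.
function buildIndex(docs) {
  const index = new Map();
  docs.forEach((text, id) => {
    for (const term of text.toLowerCase().split(/\W+/).filter(Boolean)) {
      if (!index.has(term)) index.set(term, new Set());
      index.get(term).add(id);
    }
  });
  return index;
}

// AND query: ids of documents containing every query term.
function search(index, query) {
  const terms = query.toLowerCase().split(/\W+/).filter(Boolean);
  if (terms.length === 0) return [];
  let result = index.get(terms[0]) || new Set();
  for (const term of terms.slice(1)) {
    const postings = index.get(term) || new Set();
    result = new Set([...result].filter((id) => postings.has(id)));
  }
  return [...result].sort((a, b) => a - b);
}
```

A real engine adds ranking, analyzers, facets (Bobo's specialty), and near-real-time index updates (Zoie's specialty) on top of this basic structure.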
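Finally, the UI bullet's "send JSON, render client side" flow can be sketched with a toy templating function. The real page used dust.js, which compiles templates ahead of time; this stand-in only does simple `{field}` interpolation over a JSON payload to show the data flow, and is not the dust.js API.

```javascript
// Toy stand-in for client-side templating (not dust.js):
// replace {field} placeholders with values from a JSON payload.
function render(template, data) {
  return template.replace(/\{(\w+)\}/g, (match, key) =>
    key in data ? String(data[key]) : match
  );
}

// The server sends school data as JSON; the browser renders it to markup.
const payload = JSON.parse('{"name":"Example University","students":12000}');
const html = render('<h1>{name}</h1><p>{students} students</p>', payload);
```

The design point is that the server stays a pure data API (JSON over REST) while all presentation logic lives in the browser-side templates.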