To overcome these challenges, NZME partnered with Google and utilised Google Analytics 4 user data streamed into Google Cloud’s BigQuery to identify the user events most likely to indicate each user’s interests and behaviours. These events were then used to train machine learning (ML) models that could predict which other content a user might be interested in, as well as their propensity to subscribe or churn.
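As a rough illustration of this step, per-user engagement features can be aggregated from the GA4 export tables in BigQuery with a query along the following lines. The project, dataset, feature definitions and date range below are placeholders, not NZME’s actual schema:

```python
# Sketch: derive per-user engagement features from the GA4 BigQuery export.
# Project/dataset names and the chosen features are illustrative assumptions.
from google.cloud import bigquery

client = bigquery.Client()

FEATURE_QUERY = """
SELECT
  user_pseudo_id,
  COUNTIF(event_name = 'page_view') AS page_views,
  COUNTIF(event_name = 'scroll') AS scrolls,
  COUNT(DISTINCT (SELECT value.string_value
                  FROM UNNEST(event_params)
                  WHERE key = 'page_location')) AS distinct_articles
FROM `my-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20230101' AND '20230131'
GROUP BY user_pseudo_id
"""

# Materialise the per-user features into a table that BQML can train on.
job_config = bigquery.QueryJobConfig(
    destination="my-project.ml_features.user_engagement",  # hypothetical table
    write_disposition="WRITE_TRUNCATE",
)
client.query(FEATURE_QUERY, job_config=job_config).result()
```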
To enable low-touch, efficient machine learning pipelines, BigQuery ML (BQML) models were utilised. BQML was chosen over other approaches because it lets data teams run sophisticated ML algorithms from within BigQuery, without the complexity and overhead of moving data into separate ML infrastructure. The machine learning pipeline was orchestrated using Cloud Composer, a managed GCP service based on Apache Airflow. Recommendation model output was then cached in Redis, a high-speed in-memory data store, ensuring minimal latency when returning content recommendations to the NZ Herald website.
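A simplified sketch of what such a pipeline can look like is shown below: an Airflow DAG running on Cloud Composer that trains a BQML recommendation model, generates per-user recommendations, and caches them in Redis. The model type, table names, Redis host and caching format are illustrative assumptions, not NZME’s production configuration:

```python
# Sketch of a Cloud Composer (Airflow) DAG: train a BQML model, score, cache.
# All project, dataset, table and host names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

TRAIN_MODEL_SQL = """
CREATE OR REPLACE MODEL `my-project.ml_models.article_recommender`
OPTIONS (
  model_type = 'MATRIX_FACTORIZATION',
  feedback_type = 'IMPLICIT',
  user_col = 'user_pseudo_id',
  item_col = 'article_id',
  rating_col = 'read_count'
) AS
SELECT user_pseudo_id, article_id, read_count
FROM `my-project.ml_features.article_reads`
"""

GENERATE_RECS_SQL = """
CREATE OR REPLACE TABLE `my-project.ml_output.article_recommendations` AS
SELECT user_pseudo_id, article_id,
       predicted_read_count_confidence AS score
FROM ML.RECOMMEND(MODEL `my-project.ml_models.article_recommender`)
"""


def cache_recommendations() -> None:
    """Copy each user's top-ranked articles from BigQuery into Redis."""
    import json
    import redis
    from google.cloud import bigquery

    bq = bigquery.Client()
    r = redis.Redis(host="redis.internal", port=6379)  # hypothetical host
    rows = bq.query("""
        SELECT rec.user_pseudo_id,
               ARRAY_AGG(STRUCT(rec.article_id AS id, art.is_premium AS premium)
                         ORDER BY rec.score DESC LIMIT 10) AS articles
        FROM `my-project.ml_output.article_recommendations` rec
        JOIN `my-project.content.articles` art
          ON rec.article_id = art.article_id
        GROUP BY rec.user_pseudo_id
    """).result()
    for row in rows:
        # Cache each user's top articles for a day of low-latency lookups.
        r.set(f"recs:{row.user_pseudo_id}", json.dumps(list(row.articles)), ex=86400)


with DAG(
    dag_id="content_recommendation_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    train_model = BigQueryInsertJobOperator(
        task_id="train_bqml_model",
        configuration={"query": {"query": TRAIN_MODEL_SQL, "useLegacySql": False}},
    )
    generate_recs = BigQueryInsertJobOperator(
        task_id="generate_recommendations",
        configuration={"query": {"query": GENERATE_RECS_SQL, "useLegacySql": False}},
    )
    cache_recs = PythonOperator(
        task_id="cache_recommendations_in_redis",
        python_callable=cache_recommendations,
    )

    train_model >> generate_recs >> cache_recs
```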
The propensity-to-subscribe and propensity-to-churn models were utilised by the NZ Herald subscriber marketing team as part of the data-led subscriber customer lifecycle program to deliver more relevant communications and experiences. Additionally, a content recommendation engine was developed to serve personalised content to a ‘storycard’ on the NZ Herald homepage.
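For illustration, a propensity-to-subscribe model can be expressed in BQML as a simple classifier and scored with ML.PREDICT; a churn model follows the same pattern with a different label. The logistic regression model type, the 0/1 `subscribed` label, and all feature and table names below are assumptions rather than NZME’s exact models:

```python
# Sketch: train and score a propensity-to-subscribe classifier in BQML.
# Feature columns, the 0/1 integer label and table names are illustrative.
from google.cloud import bigquery

client = bigquery.Client()

# Train a logistic regression classifier on per-user engagement features.
client.query("""
CREATE OR REPLACE MODEL `my-project.ml_models.subscribe_propensity`
OPTIONS (model_type = 'LOGISTIC_REG', input_label_cols = ['subscribed']) AS
SELECT page_views, scrolls, distinct_articles, subscribed
FROM `my-project.ml_features.user_engagement_labelled`
""").result()

# Score every active user; the probability can feed lifecycle marketing audiences.
scores = client.query("""
SELECT user_pseudo_id,
       (SELECT p.prob FROM UNNEST(predicted_subscribed_probs) AS p
        WHERE p.label = 1) AS subscribe_probability
FROM ML.PREDICT(MODEL `my-project.ml_models.subscribe_propensity`,
                TABLE `my-project.ml_features.user_engagement`)
""").result()
```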
These personalised ‘storycards’ improved the relevance of the content on the homepage: non-subscribers were served free articles, while premium subscribers received a mix of premium and free articles, with content preferences in both cases based on each user’s reading history.
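A minimal sketch of how that serving rule could be applied against the Redis cache is shown below, assuming each user’s cached recommendations are stored as a ranked list of article IDs with a premium flag; the key format, field names and fallback behaviour are assumptions for illustration:

```python
# Sketch: assemble a personalised storycard from cached recommendations.
# Assumes Redis holds a ranked JSON list of {"id": ..., "premium": ...} per user.
import json
from typing import List

import redis

r = redis.Redis(host="redis.internal", port=6379)  # hypothetical host


def storycard_articles(user_id: str, is_subscriber: bool, slots: int = 4) -> List[dict]:
    """Return the articles to render in this user's personalised storycard."""
    cached = r.get(f"recs:{user_id}")
    if cached is None:
        return []  # caller falls back to editorially curated content
    ranked = json.loads(cached)

    if is_subscriber:
        # Premium subscribers see a mix of premium and free articles.
        return ranked[:slots]
    # Non-subscribers are only served free articles.
    return [a for a in ranked if not a["premium"]][:slots]
```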