How can we improve our conversion rate going forward? It's the question of the decade for marketing analysts, and I don't expect your leadership team's focus on it to fade any time soon.
You can report, slice, dice, and segment away in your analytics platform, but needles in haystacks are not easily found unless we adapt. I know change can be difficult, but allow me to make the case for machine learning and hyperparameters within the discipline of customer analytics. It's a trendy subject for some and a scary one for others, but my intent is to offer a practitioner's viewpoint.
To create a good machine learning or predictive model, many choices have to be made when selecting algorithms and their parameters. The usual approach is trial and error: experiment until you find an algorithm that works well for the problem at hand. Often, an analyst will choose algorithms based on practical experience and personal preference. This is reasonable, because there is usually no single correct way to build a machine learning model. Many algorithms have been developed to automate the manual and tedious steps of the analytical process, yet building a model with trustworthy results still takes considerable time and effort.
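To make that trial-and-error step concrete, here is a minimal sketch of my own (not from any specific SAS workflow), using scikit-learn and a synthetic data set as stand-ins for a real conversion table. The idea is simply to compare a few candidate algorithms at their default settings before investing tuning effort in any one of them.

```python
# A minimal sketch of the trial-and-error step: compare candidate algorithms
# with default settings before committing to tuning one of them.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a customer table with a 0/1 conversion target.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
}

for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validated accuracy
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```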
A large portion of this manual work relates to finding the optimal set of hyperparameters for a chosen modeling algorithm. Hyperparameters are the settings that configure how an algorithm learns from a data set; they are chosen before training begins and, unlike ordinary model parameters, are not learned from the data itself.
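As an illustration (again my own sketch, not a SAS example), consider a decision tree in scikit-learn: the settings passed to the constructor are hyperparameters, while the split rules the tree discovers during fitting are the learned parameters.

```python
# Hyperparameters are set by the analyst before training; ordinary
# parameters are learned from the data during training.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

# max_depth and min_samples_leaf are hyperparameters: chosen up front,
# they control how the model is allowed to fit the data.
tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=20, random_state=0)

# The split thresholds inside the fitted tree are learned parameters:
# discovered from the data, not chosen by the analyst.
tree.fit(X, y)
print("Number of learned split nodes:", tree.tree_.node_count)
```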
In marketing and customer analytics applications, an analyst must make many such decisions during the training process, and a large share of the model-building effort goes into experiments to identify the best hyperparameter settings for the chosen algorithm. As algorithms get more complex (single-layer to multi-layer neural networks, decision trees to forests and gradient boosting), the time required to find good settings grows quickly.
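To see why, count the combinations. The sketch below uses a purely illustrative grid for a gradient boosting model; the specific values are my own assumptions, but the arithmetic is the point.

```python
# Back-of-the-envelope: the number of hyperparameter combinations
# multiplies quickly as the algorithm gains more knobs.
from itertools import product

# Hypothetical grid for a gradient boosting model (illustrative values only).
grid = {
    "n_estimators": [100, 300, 500],
    "learning_rate": [0.01, 0.05, 0.1],
    "max_depth": [3, 5, 7],
    "subsample": [0.7, 1.0],
}

n_combinations = len(list(product(*grid.values())))
print("Grid points to evaluate:", n_combinations)  # 3 * 3 * 3 * 2 = 54
# With 5-fold cross-validation, that is 54 * 5 = 270 model fits -- for one algorithm.
```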
There are several ways to support analysts in the cumbersome work of tuning hyperparameters. These approaches are called hyperparameter optimization, or auto tuning. Well-chosen hyperparameter settings not only affect how efficiently the model trains; more importantly, they govern the quality of the resulting model.
In general, there are three different types of auto tuning methods: parameter sweep (also known as grid search), random search, and parameter optimization.
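To anchor those terms, here is a minimal sketch of the first two methods using scikit-learn; this is my own example, not the SAS implementation. Parameter optimization refers to smarter strategies (for example Bayesian or derivative-free search) that use earlier evaluations to choose the next candidate, and is typically provided by dedicated libraries and frameworks rather than the basic searchers shown here.

```python
# Parameter sweep (grid search) vs. random search over the same space.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Illustrative search space (values are assumptions, not recommendations).
space = {"n_estimators": [100, 300], "learning_rate": [0.01, 0.1], "max_depth": [3, 5]}

# Parameter sweep: evaluate every combination exhaustively (2 * 2 * 2 = 8 fits per fold).
sweep = GridSearchCV(GradientBoostingClassifier(random_state=0), space, cv=3)
sweep.fit(X, y)
print("Grid search best:", sweep.best_params_)

# Random search: sample a fixed budget of combinations from the same space.
rand_search = RandomizedSearchCV(GradientBoostingClassifier(random_state=0), space,
                                 n_iter=4, cv=3, random_state=0)
rand_search.fit(X, y)
print("Random search best:", rand_search.best_params_)
```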
To overcome the challenges and computational expense of hyperparameter optimization, SAS provides analysts with a hybrid, derivative-free optimization framework that operates in a parallel and distributed computing environment and consists of an extendable suite of search methods.
With that said, I invite you to view a video and technology demonstration that explores these topics in more depth.
Learn more about how the SAS platform can be applied for marketing data management here.