This is not another argument between Frequentists and Bayesians! Practical significance is more than an analysis method or technique. It is the hard collaborative work of building a bridge between data and the real world of operations, customers, marketing, advertising and customer service. We need one bridge at the point of measurement and another at the point of customer communications and transactions. Practical significance is found in the answers to the "So what?" and "Who cares?" questions. Statistical significance, whatever the methodology, is a good start but insufficient. Determining practical significance is a hard collaborative task, not a solitary engagement between analyst and computer.
When data is abundant, and it often is in this day of "big data," it is tempting to jump straight into analysis. Instead, I suggest starting small and getting intimate with the "reality" that is being measured. If possible, understand what is actually being captured and in what context. Ask yourself: what else is going on in this context? What are we not measuring? How does what we measure give us hints about the unmeasured elements of the process? Once you understand the measurement context, follow your data chain all the way through the systems. Where does the data get cleaned, transformed and summarized along the way? Do I still trust the data that I see at the end of the process? At which points does data from distinct silos get combined? Are the units of analysis comparable? Practical significance at this point in the process means answering questions like: what are we actually measuring? Are we capturing that transaction or behavior reliably and comprehensively?
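One concrete way to make the "are the units of analysis comparable?" question bite is to refuse to merge silos until every record is normalized to a common unit. A minimal sketch, with hypothetical warehouse and shipping records (the field names and conversion factors here are illustrative, not from any real system):

```python
# Hypothetical silo records: one system logs weight in kg, another in lb.
warehouse = [{"sku": "A1", "weight": 2.0, "unit": "kg"}]
shipping = [{"sku": "A1", "weight": 4.4, "unit": "lb"}]

TO_KG = {"kg": 1.0, "lb": 0.45359237}

def normalize(records):
    """Convert every record to a common unit before silos are merged,
    raising loudly on any unit we don't recognize instead of mixing
    incomparable numbers into one column."""
    out = []
    for r in records:
        if r["unit"] not in TO_KG:
            raise ValueError(f"unknown unit {r['unit']!r} for {r['sku']}")
        out.append({"sku": r["sku"], "weight_kg": r["weight"] * TO_KG[r["unit"]]})
    return out

combined = normalize(warehouse) + normalize(shipping)
```

The design point is that the merge step itself enforces comparability: a new silo with a unit nobody anticipated fails fast rather than silently polluting the analysis.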
Get out from behind your computer and go talk to employees in the factory, warehouse or store. Observe and listen to their interpretation of what is being done. You might find out that some data that you want to rely on is entered at random by people working on the factory floor because the system won't let them proceed to the next screen without a value in that field. That data may seem meaningless to a front-line worker and so they just enter a random number instead of a measured value. Capture the stories behind the data. You might have a neat little theory to explain patterns seen in customer data but stories and sense-making by prospects and customers may leave you with a very different interpretation of the same data. Add some "flesh to the bones" of your data skeleton.
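Sometimes the random-number-to-get-past-the-screen problem can even be spotted in the data itself: genuine gauge readings tend to cluster around a few values, while typed-in filler spreads out. A rough sketch of that idea, using normalized entropy as a "too uniform to be real" flag (the readings below are invented for illustration, and any real threshold would need tuning with the people on the floor):

```python
from collections import Counter
from math import log2

def uniformity_score(values):
    """Normalized entropy of the value distribution: near 1.0 means every
    distinct value is roughly equally frequent (suspiciously 'random'),
    while lower scores mean readings cluster as real measurements do."""
    counts = Counter(values)
    n = len(values)
    if len(counts) < 2:
        return 0.0
    entropy = -sum((c / n) * log2(c / n) for c in counts.values())
    return entropy / log2(len(counts))

# A real gauge tends to repeat a few readings...
measured = [70, 71, 70, 72, 70, 71, 70, 70, 71, 70]
# ...while operators tapping in arbitrary numbers spread them out.
typed_in = [13, 87, 42, 5, 66, 91, 28, 54, 77, 33]
```

This is a screening heuristic, not proof; its real value is pointing you at which fields to ask the front-line workers about.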
I often find it useful to share preliminary models or even simple bivariate relationships with people immersed in the business operations. While sometimes our interpretations converge, at other times alternative hypotheses emerge and some nuance of the business situation becomes clear as we work through possible explanations for the observed relationship. Alternative hypotheses often generate additional rounds of analysis and interpretation. If possible, don't interpret on your own. Early involvement by business leaders increases buy-in for subsequent actions.
A key driver of practical significance is whether the operational process or customer interaction has an easily accessible action point. On occasion I have been frustrated to find a powerful and seemingly actionable relationship in our data only to later discover that the action point is controlled by a partner and the partner is not inclined to act. For a research finding to have practical significance it must enable differentiated communication or action to trigger differentiated outcomes.
Calculating believable Return-on-Investment (ROI) is a joint venture between accounting, business operations, sales, marketing and IT. For instance, I once discovered through data analysis that my client was leaving a fairly significant amount of money on the table by not segmenting their product set by another factor. After walking through the analysis with the business partner, we started a rough ROI by calculating the cost side of the equation. As we dug into factors such as changes in training, tripling the number of SKUs to track and reorganizing the warehouse, we soon realized that the ROI simply was not worth the cost and risks. Conversely, we discovered another, much smaller, optimization opportunity that we had almost passed by. The cost and complexity of making those changes were so small, however, that implementing it made sense.
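A back-of-the-envelope version of that cost-side exercise fits in a few lines. This is only a sketch; the gain and cost figures below are invented to mirror the two opportunities in the story, not real numbers from any engagement:

```python
def rough_roi(annual_gain, one_time_cost, annual_cost, years=3):
    """Back-of-the-envelope ROI over a planning horizon:
    (total gain - total cost) / total cost."""
    total_gain = annual_gain * years
    total_cost = one_time_cost + annual_cost * years
    return (total_gain - total_cost) / total_cost

# Hypothetical figures: a large segmentation win that triples SKUs,
# forces retraining and a warehouse reorganization...
big_win = rough_roi(annual_gain=500_000, one_time_cost=1_200_000,
                    annual_cost=400_000)
# ...versus a small tweak with almost no implementation cost.
small_win = rough_roi(annual_gain=40_000, one_time_cost=10_000,
                      annual_cost=2_000)
# With these illustrative inputs, big_win comes out negative and
# small_win strongly positive: the headline opportunity loses money
# once the cost side is honest, and the modest one pays for itself.
```

The numbers only become believable once accounting, operations and IT have each signed off on their piece of the cost side; the arithmetic itself is the easy part.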
In addition to the direct complexity cost, which should be accounted for in your ROI calculation, there is also cumulative complexity overhead. The first addition to overall complexity may seem trivial, but what about the twentieth or the hundredth adjustment to the underlying system logic? Each refinement of the model and each parameter added may well imply another fork in system logic, another wording variation in customer communications, another personalization of offers or differentiation of service. No one of these may be overwhelming on its own, and yet the cumulative rise in complexity makes it increasingly difficult to track and maintain the underlying system logic across all the inter-dependent systems. Don't forget to keep a constraint on the overall level of system complexity!
Doing the hard collaborative work of practical significance will serve you well in the end. Done well, this collaborative determination of significance ensures shared responsibility for execution. It also keeps data scientists central and relevant rather than marginal.