The irony is that the vast majority of data science folks use median far too often. I've seen so many cases where a model uses the median instead of the mean "to eliminate outliers". But if the distribution isn't normal, and the outliers are real data points rather than errors, throwing them away can be disastrous for model accuracy.
This is a dumb example, but say you're running a lottery and building a model of how much you'll pay out. Every ticket you sell nets you $1, and you check the median payout; you wouldn't want to use the mean and capture outliers like winners, would you? You get $0. Sweet, you're making 100% profit no matter how much you pay out to the winner.
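A quick sketch of that, with made-up numbers (one $500k winner per million $1 tickets):

```python
import statistics

# Hypothetical lottery: 1,000,000 tickets sold at $1 each,
# one ticket wins $500,000, the rest pay out nothing.
n_tickets = 1_000_000
payouts = [0] * (n_tickets - 1) + [500_000]

median_payout = statistics.median(payouts)  # $0 -- "100% profit!"
mean_payout = statistics.mean(payouts)      # $0.50 -- half of every ticket dollar

print(median_payout, mean_payout)
```

The median tells you the typical ticket costs you nothing, which is true and useless. The mean tells you each $1 ticket actually carries $0.50 of expected payout, which is the number your business depends on.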
A more realistic example: say you're a landlord building a model of how much rent to charge. If you look at median costs and revenues, you'll vastly understate how much you need to charge. The vast majority of tenants will pay on time and not wreck the place, but the occasional tenant you have to evict, with 5-6 figures in repairs, will bankrupt you if you're not charging more on the typical cases to make up for those outliers.
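Same idea with hypothetical tenancy numbers (the 1-in-50 eviction rate and $60k disaster cost are assumptions for illustration):

```python
import statistics

# Hypothetical annual cost per tenancy: 49 out of 50 cost ~$1,000 in
# routine maintenance, but 1 in 50 ends in an eviction plus major
# repairs totaling $60,000.
costs = [1_000] * 49 + [60_000]

median_cost = statistics.median(costs)  # $1,000 -- what a "typical" year looks like
mean_cost = statistics.mean(costs)      # $2,180 -- what you actually need to cover

print(median_cost, mean_cost)
```

Price rent off the median and you're short roughly $1,180 per tenancy per year; the rare disaster eats the margin from dozens of good tenants.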