# Non-linear scaling in Python Matplotlib contour plots

Now, I have thought about what else to do. There is of course logarithmic scaling, but then I would first need some sort of mapping, and I am not 100% sure how one would do that. Inspired by this question, one could think of a mapping of the form `scaling(x) = log(x/min)/log(max/min)`, which worked reasonably well in that question.
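That mapping is straightforward to write down. A minimal sketch (the helper name `log_scale` is my own, and it assumes strictly positive data, since the logarithm is undefined otherwise):

```python
import numpy as np

def log_scale(x, lo=None, hi=None):
    """Map positive data onto [0, 1] via log(x/min) / log(max/min)."""
    x = np.asarray(x, dtype=float)
    lo = np.min(x) if lo is None else lo
    hi = np.max(x) if hi is None else hi
    return np.log(x / lo) / np.log(hi / lo)

s = log_scale([1.0, 10.0, 100.0])  # maps to 0, 0.5, 1
```

The midpoint 10 lands exactly halfway because it is the geometric mean of 1 and 100, which is the defining property of this mapping.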

Also interesting was the follow-up discussed here, where they used some sort of `arcsinh` scaling function. That seemed to enlarge the small features quite well, in proportion to the whole.
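For reference, `arcsinh` behaves like a signed logarithm for large magnitudes but is linear near zero, which is why it enlarges small features without blowing up on zeros or negative values the way a plain log would. A small sketch (the helper and its scale parameter `a` are my own naming):

```python
import numpy as np

def arcsinh_scale(x, a=1.0):
    """arcsinh compresses large values logarithmically but stays
    linear (and sign-preserving) near zero, unlike a plain log.
    `a` sets the amplitude below which the response is ~linear."""
    return np.arcsinh(np.asarray(x, dtype=float) / a)
```

Because `arcsinh` is an odd function, positive and negative features are treated symmetrically, which matters if the data oscillates around zero.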

So my question is twofold, I suppose.

1. How would one scale the data in my contour plot in such a way that the small amplitude features do not get blown away by the outliers?

2. Would you do it using either of the methods mentioned above, or using something completely different?

I am rather new to Python and I am constantly amazed by all the things that are already out there, so I am sure there might be a built-in way that is better than anything I mentioned above.
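There is indeed a built-in route: matplotlib ships with normalizations such as `matplotlib.colors.LogNorm` and `SymLogNorm` that rescale only the color mapping, leaving the data untouched. A minimal sketch with synthetic stand-in data (the array `Z` and its outlier are made up for illustration):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripted use
import matplotlib.pyplot as plt
from matplotlib import colors

# Hypothetical smooth positive data with one large outlier
rng = np.random.default_rng(0)
Z = np.abs(rng.normal(size=(50, 50))) + 1e-3
Z[10, 10] = 1e3  # the outlier that would wash out a linear colormap

fig, ax = plt.subplots()
# LogNorm distributes the colors logarithmically; the data itself
# is not modified, only its mapping onto the colormap
cs = ax.contourf(Z, 60, norm=colors.LogNorm(vmin=Z.min(), vmax=Z.max()),
                 cmap='jet')
fig.colorbar(cs, ax=ax)
```

With the norm in place, the colorbar ticks also become logarithmic automatically, so the plot stays honest about what the colors mean.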

For completeness, I uploaded the datafile here (the upload site is robustfiles.com, which a quick Google search told me is a trustworthy site for sharing files like these).

I plotted the above with

```python
import numpy as np
import matplotlib.pyplot as plt

# Raw string prevents the backslashes in the Windows path from being
# interpreted as escape sequences
data = np.load(r"D:\SavedData\ThreeQubitRess44SpecHighResNormalFreqs.npy")

# X and Y are the coordinate grids for the B-field and frequency axes
fig, ax1 = plt.subplots(1, figsize=(16, 16))
cs = ax1.contourf(X, Y, data, 210, alpha=1, cmap='jet')
fig.colorbar(cs, ax=ax1, shrink=0.9)
ax1.set_title("Freq vs B")
ax1.set_ylabel('Frequency (GHz)')
ax1.set_xlabel('B (arb.)')
```
Oliver W.

Excellent question.

Don't scale the data. With most scaling functions you will always end up searching for a compromise between the different ranges in your data.
