Hello everyone,
Context: I’m trying to shade data so that only the regions whose standard deviation falls between two values are shown. I’ve tried the following code, but when I go back and shade mld[l=1:120@std] nothing changes.
yes? let mask = if mld[l=1:120@std] lt 30 gt 80 then 1000 else mld[l=1:120@std]
yes? set var/bad = 1000 mld[l=1:120@std]
yes? shade mld[l=1:120@std] !the normal unmasked map appears...
So with that code I’m trying to tell Ferret that if the standard deviation over that time period lies outside the range 30 to 80, it should be ignored. The outcome I’m after is a plot that shades only the areas with a standard deviation between 30 and 80. Any ideas as to why this isn’t working? Apologies if this is simple, but I’ve been stuck on it for a while as a new user!
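In case my intent isn’t clear, this is roughly the masking I think I’m after — I’m not at all sure this is valid Ferret syntax, so please treat it as a sketch of what I want rather than something I expect to run:

yes? let stdv = mld[l=1:120@std]                 ! std dev over the time window
yes? let mask = if stdv ge 30 and stdv le 80 then 1   ! 1 inside the range, missing outside (no ELSE)
yes? shade stdv * mask                           ! hoping points outside 30-80 come out as missing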
Many thanks,
Josh