Comment Re:Hmmm, not immediately obvious from the paper (Score 1) 68

But even then 2020 was roughly half way from 2007 to the 2012 minimum, and it's likely to be another 15 years plus (2027-2030) before we expect to see 2012 minimum challenged

Ouch. That'll teach me not to proofread carefully.

Of course, that 15 years plus should be from 2020, not 2012, i.e. it will be significant if the 2012 record isn't approximately matched by unexceptional melt by about 2033-2036.

Comment Re:Hmmm, not immediately obvious from the paper (Score 2) 68

I've always got it. But you still haven't got it.

Their null hypothesis is that there's no trend, and picking a very short timescale they discover they cannot reject it. Whoopee doo. Pick a short enough timescale and that is always going to be true unless there's no random variability at all.

So we first need to pick a reasonable null hypothesis. One reasonable null hypothesis is that there's been no change in trend in the entire satellite record and we find we cannot reject that either. And in my books, a 45 year trend beats a 20 year trend.

So their claim that there's no trend in the last 20 years is an "extraordinary claim" and so requires "extraordinary evidence" of which there's absolutely none in their paper at all.

They need to a) show that the decline 1979-2005 is much steeper than the full trend, b) show that the decline 2005-2025 is much less steep than the full trend and c) explain some physical reason why there was a step change in 2005. Needless to say, they won't be able to do any of those.

And for the avoidance of doubt, there is nothing in the historical record so far that allows us to reject the hypothesis that there's a linear trend in sea-ice decline. Intuitively it seems fairly obvious that at some point that has to be wrong - not least, once we get to a blue ocean event in summer the trend has to stop by definition, but even before then it seems likely that the trend will break down for various physical reasons.
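
To see why the short-window point matters quantitatively, here's a back-of-envelope sketch. The 0.5 Mkm^2 year-to-year scatter is my own illustrative guess, not a fitted number: the shorter the window, the larger a trend has to be before ordinary least squares can distinguish it from zero at all.

    # Back-of-envelope only: how large a trend a window of L years can
    # distinguish from zero, given white year-to-year noise of sigma.
    # sigma = 0.5 Mkm^2 is an assumed, illustrative value.
    import numpy as np

    sigma = 0.5                                # assumed interannual scatter, Mkm^2
    for L in (10, 15, 20, 30, 46):
        sxx = L * (L**2 - 1) / 12.0            # sum of squared deviations of 0..L-1
        se = sigma / np.sqrt(sxx)              # standard error of the OLS slope
        print(f"{L:2d} yrs: need roughly |trend| > {2 * se * 10:.2f} Mkm^2/decade")

With numbers in that ballpark, a 20-year window can't rule out zero unless the trend is getting on for 0.4 Mkm^2 per decade, which is exactly why "no significant trend over 20 years" on its own tells you very little.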

2012 was an exceptionally low year for sea-ice. Every weather effect conspired that year to cause the sea ice to reach a minimum. Earlier in the summer the ice was dispersed due to the weather allowing lots of melt in the more southerly arctic waters. Late in the summer that reversed leading to compaction and a dramatic reduction in ice extent and area as the previously dispersed ice piled up near the pole.

2007 was another low year, not an extreme event like 2012 but well below trend. 2007-2024 is 18 years, and the lowest 18 years on record are 2007-2024.

The 2025 winter maximum was the lowest on record. The trend for winter sea ice is much smaller than for summer sea ice; much of the Arctic freezes every year and will continue to do so for a very long time. There is much less multi-year ice than there was 30 years ago and almost no ice that has survived 5 or more summers.

We don't have the 2025 summer minimum yet but it's likely to be a top 10 year (where top is bad/low sea ice). We've got maybe a month of melting still to go. For much of the summer, like many years, the sea ice tracked or was below 2012 levels, but the GAC (Great Arctic Cyclone) of 2012 that caused ice area/extent to drop and keep dropping right up to mid September has not recurred in any year.

It took best part of a decade for the trend to catch up with the 2007 minimum, and a decade and a half before the 2007 minimum was comprehensively smashed (with the exception of 2012). But even then 2020 was roughly half way from 2007 to the 2012 minimum, and it's likely to be another 15 years plus (2027-2030) before we expect to see 2012 minimum challenged unless we have another major weather event leading to unusually low ice.

Comment Re:Hmmm, not immediately obvious from the paper (Score 2) 68

I'm failing to see what you're seeing. As I quoted above:

In my skim of the paper I saw:
The trend of September Arctic sea ice extent for the most recent two decades 2005-2024 is -0.35 and -0.29 million km² per decade according to the NSIDC and OSISAF sea ice indices respectively (Figures 1a and 1b). The key point, we emphasize, is that these trends are not statistically significantly different from zero at a 95% confidence level.

But in case you missed it, let me quote the entire paragraph:
The main approach for analyzing simulated changes in Arctic sea ice cover is to compute the linear trend for the 20-year period 2005-2024 for each individual member available for each model, as motivated by the observed changes (Section 3.1). This gives a range of 10-100 members to examine the spread of simulated trends for each model and scenario. The main definition of pause used in this study is motivated by the observed 2005-2024 September sea ice extent trends ( million km²/dec, taking the most conservative estimate from observations). We also use an alternative definition (trends which are not statistically significant at the 95% confidence level) to ensure that this specific observed threshold does not overly influence the results. This secondary definition contains information about the signal-to-noise ratio, and so is complementary to the trend threshold definition. However, we find that both definitions produce consistent results. When we report multi-model averages, we do so by using a square-root weighting scheme to take account of the number of members in each ensemble (see Supporting Information S1 for a detailed explanation).

Clearly here they're looking at the 2005-2024 range. There's nothing fundamentally wrong with doing that but it becomes difficult to tell if you're making up phantoms.

Very crudely, I'd guess that the zero trend line there has a value of 4.5Mkm^2. Maybe 5 on OSISAF. With me so far?

I can't be arsed to do the analysis but my guess on the trend line 1979-2005 would put the 2005 end point on or above 6Mkm^2.

So they've got a discontinuity in their analysis at their change point of at least a decade's worth of loss, probably significantly more. And then they're claiming that there's been a pause?
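
For concreteness, this is roughly the check I'm doing in my head. It's only a sketch: it assumes you've already loaded the observed September extent series yourself (say from the NSIDC sea ice index) into numpy arrays called years and extent, which are my placeholder names, not anything from the paper.

    # Sketch: fit the 1979-2005 and 2005-2024 trends separately and see how big
    # the jump is where the two lines meet at the 2005 change point.
    # Assumes `years` and `extent` are same-length numpy arrays covering 1979-2024.
    import numpy as np

    early = years <= 2005
    late = years >= 2005

    m1, b1 = np.polyfit(years[early], extent[early], 1)   # 1979-2005 fit
    m2, b2 = np.polyfit(years[late], extent[late], 1)     # 2005-2024 fit

    at_2005_early = m1 * 2005 + b1     # where the early trend line ends up in 2005
    at_2005_late = m2 * 2005 + b2      # where the late trend line starts in 2005

    print(f"early trend {m1 * 10:+.2f} Mkm^2/dec, late trend {m2 * 10:+.2f} Mkm^2/dec")
    print(f"jump at the 2005 change point: {at_2005_early - at_2005_late:+.2f} Mkm^2")

If that jump comes out anywhere near a decade's worth of loss, the "flat since 2005" description has an unphysical step built into it.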

https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fwww.realclimate.org%2Fin...
from well over a decade ago talks about the same issue with people "analysing" temperature.

Comment Re:Hmmm, not immediately obvious from the paper (Score 1) 68

No it doesn't show that. Definitely not by eye.

You do NOT see a change in trend by drawing a line from A to B and another from B to C and declaring that that implies there's no trend in sea ice decline over the last 20 years.

You have to show that there's no possible trend from A to C that is consistent with the trends from A to B and B to C. (You also can't have a discontinuity at B, which is another thing that doing it by eye can mislead you about.)

My by-eye look suggests that there's no significant change in trend over the satellite record.

If there are statistically significant trend changes in that record (and I doubt that there are), then I'd expect two change points, one around 2000 and one around 2010, with the pre-2000 and post-2010 trends being similar and a much steeper decline during the 2000s.

I wouldn't put money on it, as I don't do this sort of thing often enough to be confident in my "guess by eye", but I'd be looking for a mistake in someone's formal analysis that suggests that there's a statistically significant change in the trend in the current record at any point.
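
If someone does want to go beyond eyeballing, the first thing I'd try is something like the sketch below: a single straight line versus a continuous piecewise fit with one hinge, over the same assumed years/extent arrays as before. It's deliberately crude: a real analysis would account for autocorrelation and for the fact that searching over breakpoints inflates the apparent significance.

    # Crude change point sketch: does allowing one slope change (with no jump)
    # fit the record significantly better than a single straight line?
    # Assumes `years` and `extent` numpy arrays holding the September series.
    import numpy as np
    from scipy import stats

    def rss(design, y):
        beta, *_ = np.linalg.lstsq(design, y, rcond=None)
        return float(np.sum((y - design @ beta) ** 2))

    n = years.size
    rss1 = rss(np.column_stack([np.ones(n), years]), extent)   # one straight line

    best = None
    for bp in range(1990, 2016):                  # candidate breakpoint years
        hinge = np.maximum(0, years - bp)         # slope change after bp, no step
        X = np.column_stack([np.ones(n), years, hinge])
        r = rss(X, extent)
        if best is None or r < best[1]:
            best = (bp, r)

    bp, rss2 = best
    f = (rss1 - rss2) / (rss2 / (n - 3))          # F statistic for the extra term
    p = 1 - stats.f.cdf(f, 1, n - 3)
    print(f"best breakpoint {bp}: F = {f:.2f}, nominal p = {p:.3f} (optimistic)")

Even then, a nominally small p-value wouldn't settle it, because picking the best breakpoint after the fact is exactly the kind of thing that produces phantom trends.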

Comment Re:Hmmm, not immediately obvious from the paper (Score 3, Informative) 68

Sorry, but I don't know what point you're trying to make. Your kind link to climate.nasa.gov doesn't tell me anything I didn't already know.

The headline says "Dramatic Slowdown in Melting of Arctic Sea Ice Surprises Scientists", but I cannot see anything in the paper that suggests that there's actually a statistically significant slowdown at all, nor that it has surprised scientists.

The title of the paper is:
Minimal Arctic Sea Ice Loss in the Last 20 Years, Consistent With Internal Climate Variability

In my skim of the paper I saw:
The trend of September Arctic sea ice extent for the most recent two decades 2005-2024 is -0.35 and -0.29 million km² per decade according to the NSIDC and OSISAF sea ice indices respectively (Figures 1a and 1b). The key point, we emphasize, is that these trends are not statistically significantly different from zero at a 95% confidence level.

So, unless there's something in the paper that I've missed, I'm not convinced that they've established that there's a pause at all. And even the title of their letter implies as much: it says "loss", not "pause"! At best they have established that there might be a pause and that the measured *decline* over the last two decades is due to natural variability.

We need change point analysis to establish if there's been a statistically significant change in the rate of sea ice melting over the last two decades compared to the previous two and a half decades.

Comment Hmmm, not immediately obvious from the paper (Score 5, Interesting) 68

Where's Tamino when you need him?

I've not read the paper, just scanned it for keywords but my first glance doesn't fill me with confidence that the headline matches the results.

They say that there's a decline for the last two decades 2005-2024 and that that's statistically indistinguishable from zero. Fair enough. But it's clearly also statistically indistinguishable from a larger decline.
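
One way to make that concrete (a sketch only, assuming the observed September series is loaded into numpy arrays years and extent) is to look at the confidence interval on the 2005-2024 trend rather than just its point estimate.

    # The 95% confidence interval on the 2005-2024 trend: wide enough to contain
    # zero *and* something like the long-term decline.
    # Assumes `years` and `extent` numpy arrays of the observed September extent.
    import numpy as np
    from scipy import stats

    w = (years >= 2005) & (years <= 2024)
    res = stats.linregress(years[w], extent[w])
    t = stats.t.ppf(0.975, int(w.sum()) - 2)
    lo, hi = res.slope - t * res.stderr, res.slope + t * res.stderr
    print(f"2005-2024 trend: {res.slope * 10:+.2f} Mkm^2/dec "
          f"(95% CI {lo * 10:+.2f} to {hi * 10:+.2f} Mkm^2/dec)")

If that interval comfortably contains both zero and the pre-2005 rate, then "not significantly different from zero" and "no change from the long-term decline" are both still on the table.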

So it's not completely obvious to me that there's been a change in the rate of sea ice decline, just that the earlier decades might have been a bit high, with regression to the mean giving a smaller measured rate of decline than is actually happening.

Need some change point analysis to tell if there's anything significant happening and I couldn't see anything in the article that suggested that this had been done.

Comment Re:m/s does not mean miles per second! (Score 1) 28

The summary says:

For the first time ever, a CCTV camera in Myanmar captured real-time footage of a supershear strike-slip earthquake moving at 3.7 miles per second.

There is no "realtime footage of a supershear moving at 3.7 miles per second." The horizon in the video is at most a few hundred metres away. The entire "realtime footage" is of the ground moving at about 3m/s.

Comment Re:m/s does not mean miles per second! (Score 2) 28

I think the 3.7 miles per second is the speed at which the earthquake travelled along the ground from the epicentre. At 50 frames per second, it's travelling 120m per CCTV frame, which means there's no recording of the rupture propagating, only of the 1.8 seconds that it took for the ground to move around 3 metres.
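
Just to show the arithmetic, using the 3.7 miles per second figure from the summary and the 50 frames per second above:

    # 3.7 miles per second expressed per CCTV frame.
    mile = 1609.344                   # metres per mile
    rupture_speed = 3.7 * mile        # roughly 5950 m/s
    fps = 50                          # assumed CCTV frame rate
    print(f"{rupture_speed:.0f} m/s is about {rupture_speed / fps:.0f} m per frame")

At that speed the rupture front crosses anything in view between one frame and the next, which is why the video can only show the ground moving, not the rupture travelling.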

Comment Re:Retiring from IT in general seems like a good i (Score 4, Informative) 85

I can't say there's a single thing I enjoy about any of this any more. I think I understand why so many IT people retire early and take up farming.

I suspect it's going to be important but I wonder if these companies that are demanding it are being premature.

I remember when syntax highlighting came in. For a good while I disabled it as I found it distracting, but over time I got used to it and now I prefer to have it on (although I'm still not greatly inconvenienced if I don't have colour).

I remember when LSPs became available. For a while I didn't bother but now I find it hard to work without one configured (in vim) although I do still resort to grep/sed etc, especially where the LSP doesn't work in some C++ where type deduction is hard.

And now LLMs. I use them via a web prompt and they can save considerable time where I'm looking something up. But they've also cost me considerable time where I've relied on an answer which, once I actually spent the five minutes reading the (long) manpage, was so obviously wrong that I'd be concerned about the long-term success of any programmer who relied on it, and I'd certainly be trying to limit their access to any code that I'm responsible for!

I think more than anything it's this utterly absurd wrongness, delivered with such verisimilitude, that makes them dangerous currently.

does unlink in debugfs update the free block list?

Yes, when you use unlink on a file in debugfs (or via any means in the Linux filesystem stack), it does eventually update the free block list -- but not immediately at the debugfs tool level unless explicitly committed or updated.

In debugfs specifically:

        debugfs is a low-level filesystem debugger. It does not mount the filesystem or execute actions in real time the way the kernel does during normal operations.

        If you run the unlink command in debugfs, it removes the directory entry, and depending on the inode's link count, it may also release the inode.

        But to update the free block list, debugfs must explicitly free the inode and associated blocks.

In recent versions of debugfs:

        unlink can update the block and inode bitmaps if used correctly and if the file has no other hard links.

        But sometimes manual intervention (e.g., using clri to clear inodes or freeb to free blocks) is necessary in corrupted filesystems or after a crash.

Summary:

        Yes, unlink in debugfs can update the free block list.

        But this happens only if the file is fully unlinked (no hard links remain).

        Remember that debugfs works outside the kernel, so changes are made directly to disk structures -- use with caution.

But:

man debugfs

              unlink pathname
                            Remove the link specified by pathname to an inode. Note this does not adjust the inode reference counts.

Comment Surely copilot should just fix the issue? (Score 1) 47

I really don't get it. Isn't AI supposed to save everybody time? Someone has gone to the trouble of describing their issue to AI in sufficient detail that the AI can create an issue for a human to understand it and do the work, so why doesn't AI just create the pull request?

Even better, if the project maintainer doesn't like it, the requester can fork the project and have their feature/bugfix/whatever, and the world is a better place.

When I create a bug (even a wishlist bug) I often/usually have a patch too, and I'm usually running with that patch already, because the reason I spent the effort on the bug is that I needed it fixed.

Comment Handheld advertising delivery device (Score 4, Informative) 70

Sadly, this is probably true. I still use my phone as a communications device but it's slowly getting harder and harder to use it like that. Google (and Apple, although I've not used it much other than to try it when Google first broke always-on VPN, whereupon I discovered that Apple has the same problem without the option of a custom ROM) are focused on tracking and advertisement delivery (AKA money extraction).

I use always-on VPN (to a private server); that way there's no information in my IP address about where I am, since everything comes from a handful of IPs. Sure, websites etc. can (probably) know it's me each time, but they don't get any information about where I might be.

But both Apple and Google have (IMO deliberately) broken this. I don't know of any way to have always-on VPN working on Apple and receive things like WhatsApp, because the phone going to sleep disables the VPN. On Android it's still possible[1] on a rooted device with a custom ROM, but you do have to accept that a significant proportion of apps won't work on a rooted phone - so you end up needing a second phone to use the first phone as a hotspot.

[1] I suspect that one day WhatsApp will refuse to run on a rooted phone.

Comment Sort of like ADRs then? (Score 2) 17

So this lets non-US investors access US stocks.

And when the orange menace decides he doesn't like it and forcibly converts them to the underlying stock, but doesn't allow the investor to sell their holding in the US, they're stuffed - much like what happened to ADR holders of Russian equities when Putin invaded Ukraine and required the ADRs to be delisted and converted to the underlying shares.

Oh, this is crypto, there is no "underlying share", you just think there is.

Comment Re:And yet... (Score 4, Interesting) 164

I work for a (regulated) American company but am based in the EU.

One of the problems is the different way regulations are written and enforced.

In America the regulator comes in and says "You need to do X, Y, Z to get into compliance". We do X, Y, Z, and next year the regulator says "all good".

In the EU the regulator comes in and says "These things aren't right. X, Y, Z must be done". We do X, Y, Z, and then next year the regulator says "This isn't good enough. That previous report wasn't a list of things to do, it was a list of things that definitely weren't right".

This makes it much harder, and much more expensive to comply. Not least because there's a constant conflict between "is this enough?" and "should we spend more money to do more?".

We now have a good relationship with the regulator but it was a painful few years.

I don't know much about the Asia side, but I think the regulations are stricter and easy to accidentally violate when a bug gets introduced, yet easier to understand and to know when you are and aren't in compliance - therefore, at least in our company, they're considered easier and cheaper to deal with.
