Well, it depends on how you define "accuracy". A clock can only ever be accurate in its own reference frame. As soon as you reach outside of the local reference frame, though, there's nothing directly tying the ticking of this clock to any other. So while atomic clocks are great for knowing how much time has passed locally, they are (in and of themselves) generally pretty useless at knowing what time it is.
"What time it is" is effectively a fabrication. UTC (the most common version of "what time it is") combines the measurements of several hundred atomic clocks around the world to get an "official" time. Several hundred clocks that are all accurate to parts-per-billion, but all existing in different reference frames, and thus all ticking slightly differently. (And as a bonus, those reference frames change as materials deep in the earth move, underground water tables change, etc., so you can't even just program an offset into each clock so that everything lines up...)
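To make the "combines the measurements" part concrete, here's a toy sketch of a paper timescale as an inverse-variance weighted average of clock readings. This is emphatically NOT the real BIPM algorithm (ALGOS, which also weights clocks by long-term predictability and applies per-clock frequency predictions); the function name and all the numbers are invented for illustration.

```python
# Toy sketch of a "paper" timescale: no single clock *is* UTC; the
# timescale is a weighted combination of many clocks' readings.
# NOT the real BIPM ALGOS algorithm -- all names/numbers are invented.

def paper_time(readings, variances):
    """Inverse-variance weighted average of clock readings (seconds)."""
    weights = [1.0 / v for v in variances]
    return sum(r * w for r, w in zip(readings, weights)) / sum(weights)

# Three clocks disagree at the nanosecond level; the less stable
# clock (larger variance) gets proportionally less say.
readings  = [0.0e-9, 4.0e-9, 8.0e-9]   # offsets from some reference, s
variances = [1.0, 1.0, 2.0]            # relative instability, arbitrary units

print(paper_time(readings, variances))  # pulled toward the stable clocks
```

The point of the weighting is the one in the paragraph above: no member clock is "right", so the ensemble average *defines* the official time rather than measuring it.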
GPS clocks are actually corrected. There are at least three different corrections and compensations going on:
- The clocks were configured so that they would run at the right speed in orbit, by making them run at the wrong speed on the ground (this is a compensation)
- GPS time is 'steered' towards UTC(USNO) to keep GPS time and UTC as close as possible (this is a correction)
- The GPS system announces what the time differential between UTC(USNO) and GPS time is, and how fast the two are diverging -- these are the A0 and A1 parameters that caused the 13 µs timing anomaly in January. (This is a compensation)
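The first bullet (clocks deliberately set to run at the "wrong speed" on the ground) can be checked on the back of an envelope: general relativity says the orbiting clock runs fast because gravity is weaker at altitude, special relativity says it runs slow because of orbital velocity, and the gravitational term wins. The sketch below uses standard constants and deliberately ignores the geoid and Earth's surface rotation, so the last digit or two is approximate.

```python
# Back-of-envelope check of the "wrong speed on the ground" compensation.
# Gravitational blueshift minus velocity time dilation for a GPS orbit;
# geoid and Earth-rotation details are deliberately ignored.

GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
C  = 2.99792458e8        # speed of light, m/s
R_EARTH = 6.371e6        # mean Earth radius, m
A_GPS   = 2.6561e7       # GPS semi-major axis (~20,200 km altitude), m

grav = (GM / C**2) * (1.0 / R_EARTH - 1.0 / A_GPS)  # runs fast: weaker gravity
vel  = (GM / A_GPS) / (2.0 * C**2)                   # runs slow: v^2 / (2 c^2)

net = grav - vel          # net fractional rate offset, roughly +4.45e-10
per_day = net * 86400     # seconds gained per day, roughly 38 microseconds

print(f"net fractional offset: {net:.3e}")
print(f"clock gain per day:    {per_day * 1e6:.1f} us")
```

That ~4.45e-10 fractional offset is why the onboard standards are tuned slightly below their nominal frequency before launch: once in orbit, relativity brings them back to rate.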
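For the third bullet: per the GPS interface spec (IS-GPS-200), the broadcast GPS-to-UTC offset is a first-order polynomial, roughly delta_t = dt_LS + A0 + A1 * (time since the reference epoch), which a receiver subtracts from GPS time. Here's a hedged sketch of a receiver applying it; the parameter values are illustrative, not real broadcast data (17 leap seconds matches early 2016, the rest is made up).

```python
# Sketch of a receiver applying the broadcast GPS->UTC correction:
# a first-order polynomial in time since the reference epoch.
# Parameter values below are illustrative, NOT real broadcast data.

SECONDS_PER_WEEK = 604800

def gps_to_utc(t_gps, wn, a0, a1, t_ot, wn_t, dt_ls):
    """Convert GPS seconds-of-week to UTC using broadcast A0/A1 terms."""
    dt = t_gps - t_ot + SECONDS_PER_WEEK * (wn - wn_t)
    delta = dt_ls + a0 + a1 * dt   # total GPS - UTC offset, seconds
    return t_gps - delta

# 17 leap seconds, a few nanoseconds of A0, negligible A1 drift.
utc = gps_to_utc(t_gps=345600.0, wn=1880, a0=3e-9, a1=1e-15,
                 t_ot=405504.0, wn_t=1880, dt_ls=17)
print(utc)
```

The January anomaly was exactly this path going wrong: a bad A0-type value uploaded to part of the constellation shifted the computed UTC by about 13 µs even though the satellite clocks themselves were ticking fine.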
Anyhow, the best way to look at the long-term 'accuracy' of an atomic clock is to treat it as the uncertainty in passage-of-time measurements within the clock's local reference frame. And that, in and of itself, has almost nothing to do with actually knowing what time it is.
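That local passage-of-time uncertainty is conventionally quantified with the Allan deviation: how much the clock's averaged fractional frequency wanders between adjacent intervals. Below is a minimal non-overlapping estimator with invented input data, just to show the shape of the calculation.

```python
# Minimal (non-overlapping) Allan deviation estimator.
# Input: fractional-frequency samples; output: stability at tau = m
# sample intervals. The data fed in below is invented for illustration.

def allan_deviation(y, m=1):
    """Allan deviation of fractional-frequency samples y, averaged
    over blocks of m samples."""
    blocks = len(y) // m
    avg = [sum(y[i*m:(i+1)*m]) / m for i in range(blocks)]
    diffs = [(avg[i+1] - avg[i]) ** 2 for i in range(blocks - 1)]
    return (sum(diffs) / (2 * (blocks - 1))) ** 0.5

# A clock with a constant frequency offset is perfectly *stable*
# (Allan deviation 0) even though it keeps the wrong time -- the
# local-stability vs. what-time-is-it distinction in a nutshell.
print(allan_deviation([1e-12] * 8))          # 0.0
print(allan_deviation([1e-12, -1e-12] * 4))  # ~1.41e-12
```

Note what the second case shows: a clock can flip between two rates and have a poor Allan deviation while averaging to the "right" frequency, so stability and knowing-what-time-it-is really are separate questions.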