Professional broadcasts use something called "time code". Time code generators work along with a sync generator to make sure that every frame a television truck creates gets created at precisely the same moment, and gets tagged with a unique, sequential timecode. The better ones sync with GPS or a modem for precision time; the cheaper ones you have to set manually - usually with a cell call to the atomic clock. We embed that timecode signal into the ancillary data of each video frame, and produce a side-channel audio signal with the data embedded for devices that can't accurately read the timecode from within the video frames. We have to be incredibly accurate - a video frame that is even a fraction of a microsecond out of time can cause all sorts of issues. This used to be a huge deal in analog standard-def video - any frame that was out of time could cause the picture to shift horizontally or vertically (think of bad tracking on VHS tapes as a small example). Even sync was less accurate back then - it was delivered through "black burst", which was literally just that: a burst of a black video signal, where you took the sync pulse and lined it up to ensure the timing of the frame was accurate, or "close enough". Now with HD we use tri-level sync, which is far more accurate.
For the TL;DR crowd, in a production truck we can make precision measurements of time based on our sync and time code. The company that has created a good percentage of the official replay systems in the US - DVSport - has no access to our sync or time code. They also take our uncompressed frames and compress them into a video stream. They generate something loosely akin to our time code, but really it's just a reference point to where in the compressed stream you'd like to view.
Because of the inherent inaccuracies in how they time-tag their compressed video, and the inaccuracy of their internal clock itself, their time code - even when properly set - can "float". The longer you record, the more float you get, and it's not unusual to see minutes of float in a day. If your internal source clock is inaccurate - and your math is trying to divide a second into the wrong number of frames - you get issues like this. You get severe time code float from 60fps vs 59.94fps alone, and that's BEFORE considering how accurate your reference clock is, and without any regard to how accurate your MPEG video encoder is. People are speculating that the software didn't know the video source was 59.94fps and was doing math based on 29.97fps or 30fps.
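To see how big that 60 vs 59.94 mistake gets, here's a back-of-envelope sketch: NTSC-derived HD actually runs at 60000/1001 ≈ 59.94 fps, so software that counts frames and divides by an even 60 drifts steadily behind the clock.

```python
# How much a timecode "floats" when software stamps frames assuming one
# rate while the video actually runs at another.

from fractions import Fraction

ACTUAL_FPS = Fraction(60000, 1001)   # real frame cadence of the video
ASSUMED_FPS = Fraction(60)           # rate the (buggy) software divides by

def float_after(seconds_of_recording):
    """Seconds of timecode error after `seconds_of_recording` real seconds:
    frames actually produced, divided by the wrong frame rate."""
    frames = seconds_of_recording * ACTUAL_FPS
    labeled_seconds = frames / ASSUMED_FPS
    return float(seconds_of_recording - labeled_seconds)

print(f"float per hour: {float_after(3600):.2f}s")      # about 3.6 seconds
print(f"float per day:  {float_after(86400):.2f}s")     # about a minute and a half
```

That's well over a minute of float per day from the frame-rate mismatch alone, before you even account for an inaccurate reference clock.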
Even in the professional world we get tiny bits of "float", but ours is typically only a frame or two per day. We also issue what's called a "time code jam" - where we issue a uniform break in the time code stream to make sure every device is still synchronized to each other without falling too far behind actual time of day. These cheaper replay devices don't come anywhere near that level of accuracy.
Now imagine loosely time-tagging video into a compressed stream, and taking that wholly inaccurate time and reattaching it to video frames that are being uncompressed by an MPEG decoder on-the-fly. And now you can see how accurate relying on a replay system time overlay is. Prosumer video products like DVSport don't hold a candle to the timing standards we use in professional television production. Not that they can't - they just don't. More than likely because it's never become an issue up until now - or worked "well enough" for no one to notice. That is, until something as big as the outcome of a game relies on your kludge of a modestly-accurate timing reference.
We, the people who work on the technical back-end to create the HD broadcasts you watch, are fighting a never-ending battle with crappy, hastily-written software that can't tell the difference between 30fps and 60fps.
Professional video gear that costs tens of thousands of dollars per unit still ships with software settings that assume the video coming in is 29.97fps - in both the settings and the math - a rate that hasn't been used in broadcasting since the days of standard def. Even frame synchronizers - the workhorse devices that cross-convert and conform video feeds into whatever standards we need - still push out software that claims an output of 29.97fps when it's really pumping out 59.94fps. Not to mention, when the marketing staff puts together an on-air read to tell you how super-duper-awesome their new super-slow-mo cam-du-jour is, I can't tell you how many times I hear on-air talent still use the "regular cameras shoot 30 frames a second but ours shoots 1,000!!@!" technical explanation, which just flat out isn't true anymore and hasn't been for nearly a decade.
If it's HD, you're more than likely staring at 59.94fps. In fact, any time you see an HD picture that is in 29.97fps, people immediately ask "Hey, why is that picture strobing?" This was a huge problem back when GoPros could only do 1080p at 30fps. Anyone who wasn't smart enough to set the cameras to 720p and upconvert them was met with very substandard results.
The only reason this hasn't come to a head sooner is that, most of the time, this poorly-written software and its completely inaccurate timing isn't used as an official timing device to determine the outcome of a game. It was only a matter of time, pun not intended.
Australia no longer accepts signatures at all. As of August last year it became chip & PIN only.
Untrue. I was there in March of this year, and made north of 35 signature transactions up and down the entire east coast on at least two different cards. For cards without chips, Visa tells you specifically that all merchants that accept their cards are REQUIRED to accept signatures. Their travel department goes as far as to tell you that if you are refused a transaction because a merchant refuses to accept a signature as verification, to call Visa collect from the store and they will straighten things out for you.
I imagine that policy will now change starting tomorrow, but until that point - including early this year - they accepted signatures.
It all depends on your company and how they communicate, but I just cancelled a policy with a AAA agent last month because his office was 0/3 in returning voicemails. I had my policy redone with another office, where I get much better customer service.
That being said, I deal exclusively with a troublesome student loan company by phone and voicemail - because by law I can record those conversations. Inflection, sarcasm, level of actual interest, country of origin of your telephone representative - all of those things are lost in email. Think of your personal conversations too. Sobriety, amount of distraction, even some subtle location information. All gone with texts and email.
Thanks, but no thanks. I'll keep my voicemail.
I work for a sizable sports network. Sony had a ton of inventory purchased across many networks to promote the release. They pulled ALL of it, ridiculously close to airtime. Way closer than we normally allow.
They were negotiating down to the wire to not have to cancel this movie. And why wouldn't they? They stand to lose tens of millions unless they're smart about how they do a private release now.
Trust me. Sony has released FAR shittier movies than this. This one had buzz going for it. Remember that months ago, NK declared it an act of war.
This looks completely legit. A ridiculously weak - and in my mind completely wrong - move, but legit.
The reason I don't own a Nest or any other piece of "learning" gear is twofold. First, I don't want any third party to know my settings and be able to deduce when I'm home. Second - and more importantly - I don't want my devices to "think" for me.
I keep a very irregular schedule that is the polar opposite of my wife. I work nights, she works days. My work nights vary wildly (I'm a contractor), hers do not (minus holidays or professional development days). Any "learning" a thermostat does in our household will be wrong.
For this purpose, I homebrewed a thermostat. I have an Omnistat with serial control, and I wrote a Raspberry Pi interface to talk to it. I then wrote an Android app to interface to the Raspberry Pi, so I can control the thermostat from inside the home or outside.
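The Pi-side glue for a setup like that can be sketched very simply. To be clear, this is my own illustration, not the author's actual code: the Omnistat's real serial protocol isn't reproduced here, so the ASCII "SET"/"GET" commands below are hypothetical stand-ins, and the transport is injected so the same class would work with a pyserial port or a test double.

```python
# Minimal sketch of a Pi-to-thermostat bridge. The command strings are
# hypothetical placeholders for whatever the real serial protocol uses.

class ThermostatClient:
    def __init__(self, transport):
        # `transport` needs write(bytes) and readline() -> bytes, which a
        # pyserial Serial object happens to provide.
        self.transport = transport

    def set_target(self, degrees_f):
        """Send a new setpoint; return the device's acknowledgement."""
        if not 50 <= degrees_f <= 90:
            raise ValueError("setpoint out of sane range")
        self.transport.write(f"SET {degrees_f}\n".encode())
        return self.transport.readline().decode().strip()

    def read_current(self):
        """Ask for the current temperature reading."""
        self.transport.write(b"GET\n")
        return float(self.transport.readline().decode().strip())
```

From there the phone app only needs to hit a tiny HTTP endpoint on the Pi that calls `set_target()` and `read_current()` - no cloud, no third party, no "learning".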
Why did I go to all of that trouble? Because there is no product on the market that fits my two criteria - no outside party data collecting, and no "thinking".
Seriously, why is this so hard? I understand the desire to make things simple for the non-techies out there... but why in the world can't you offer me the option to strip everything away and use the thermostat in the simplest manner possible?
I'm having the same problem with lighting control right now. I would like a GPI contact closure to turn an LED light dimmer on and off, but never inhibit its ability to be turned on locally. You may say "Z-Wave!" or any of the other RF controls out there. The problem is that none of them meet my criteria for dimmable LED lighting: no software dimmers, and the ability to switch a light on and off at its set dim point without ever losing local control. All I want is a physical dimming slider and an on/off switch - not a software dimmer that gradually fades the output up and down and that you have to stare at the LEDs to set once the unit is on. If I can't hit the switch and have an instant on with 100% certainty of what dimming level the light will pop on at, I don't want it.
My next house project will be a low voltage relay to grab the sunrise/sunset times, and turn my exterior LV lights on at sunset + 30min, and off at sunrise - 30min. Nothing outside of a photosensor does that now, and it doesn't do it reliably (think cloudy days, snow cover, etc). So I will homebrew it. And be happy.
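The decision logic for that relay is tiny once you have the day's sunrise/sunset times (from a NOAA almanac table or a library like `astral` - not shown here, and my choice of sources, not the author's). A sketch of just the on/off rule:

```python
# Given today's sunrise/sunset, decide whether the exterior lights should
# be on: sunset + 30min through sunrise - 30min, spanning midnight.

from datetime import datetime, timedelta

ON_DELAY = timedelta(minutes=30)   # lights on this long after sunset
OFF_LEAD = timedelta(minutes=30)   # lights off this long before sunrise

def lights_should_be_on(now, sunrise, sunset):
    """True between (sunset + 30min) and the next morning's
    (sunrise - 30min). All arguments are datetimes on the same local day."""
    on_at = sunset + ON_DELAY
    off_at = sunrise - OFF_LEAD
    # Overnight span: late-evening side OR early-morning side of midnight.
    return now >= on_at or now < off_at
```

A cron job on a Pi could evaluate this every minute and drive the low-voltage relay through a GPIO pin - deterministic, unlike a photosensor on a cloudy day.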
Give me total control of my devices, with no "thinking" whatsoever. That's all I want in home automation. No one is doing that right now, and it frustrates me to no end.
Now if we can just add language to somehow apply this to apps...
A commercial entity being allowed to download all of the info out of my smartphone makes me no more comfortable than the government doing it. Especially when it's through a Trojan horse such as Candy Crush or Angry Birds.
This is the only reason I root my Android. If reasonable restrictions were in place, I wouldn't need to. But until the advertising giant and information harvester that writes the OS has a change of heart, I will continue to restrict said access through any means necessary.
1) I need a car that will do 80 consecutive miles without a charge.
2) I need a car that can go 300 or 400 miles - in whatever manner.
That means one of three things:
1) A rapid charge after 80 miles (sub 15 minutes)
2) One of those sweet-looking NASCAR-esque battery swaps that Tesla does, or
3) A hybrid
It also has to hit a reasonable price point to make it comparable to an efficient gasoline burner, i.e. sub-$45,000.
Less than 80 consecutive miles and my initial purchase cost is no longer offset by the fuel savings. Less than a 400-mile trip and I have to own a second car for business trips. Both of these are show-stoppers for me, and anyone else who has any sort of reasonable daily work commute.
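That break-even logic is easy to put numbers on. Every figure below is my own assumption for illustration (plug in your own EV premium, fuel price, and efficiency numbers), not anything from the original:

```python
# Back-of-envelope EV break-even: how many miles of driving it takes for
# fuel savings to pay back the extra up-front cost. All numbers assumed.

PREMIUM = 12000.0    # extra purchase cost of the EV vs a gas car, USD
GAS_PRICE = 3.50     # USD per gallon
MPG = 40.0           # an efficient gas burner
ELEC_PRICE = 0.12    # USD per kWh
EV_MI_PER_KWH = 3.5  # EV efficiency, miles per kWh

def breakeven_miles():
    per_mile_gas = GAS_PRICE / MPG              # ~8.8 cents/mile
    per_mile_ev = ELEC_PRICE / EV_MI_PER_KWH    # ~3.4 cents/mile
    return PREMIUM / (per_mile_gas - per_mile_ev)

print(f"break-even after {breakeven_miles():,.0f} miles")
```

With these assumed numbers the break-even lands well past 200,000 miles - which is exactly why a short electric range or a big price premium kills the math for a commuter.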
If Tesla can achieve their goal of making a car for $35,000 - I'm in. If I can get a plug-in hybrid with a battery pack that will go 80 miles, I'm in. Until then, I'm stuck with high-MPG gas burners - which for the time being are still more cost efficient over their life span.
Though ideologically, even at a higher price point, I'd be more than happy to stop purchasing gas, even at a higher overall cost. I'm just waiting for someone to make a practical vehicle that will let me do just that.
As someone who puts those shows up in the air, I'll tell you it all depends on what you're looking for.
If you're looking for sports backhauls, you'll most likely be disappointed. Almost every professional sports venue in the country uses fiber as a backhaul, not satellite. The only places that use birds are places that do small numbers of broadcasts infrequently (think college campuses). And even then, a lot of networks have policies that require them to use BISS encryption. So unless you're friends with someone in the uplink world (because yes, they do share downlink info and BISS codes with each other so they can watch live events while on the road) you'll find the content to be sparse.
If you're looking for TV networks, look at it this way: if there's any value to the network (i.e. if it's on anything but the base tier of your cable or satellite operator), it's encrypted. Why would they give away for free what they're getting $1.60/subscriber to sell? You'll find some foreign networks and stuff you most likely don't care about, but that's about it. If you know the timing you might find syndicated shows being fed to your local TV stations (think Ellen or Judge Judy) or something mildly useful like that, but even those shows are increasingly being BISS encrypted. The only reason more syndicated shows don't encrypt is that the distributors get sick of having to pay to re-feed shows when inept downlink ops botch the decryption.
The holy grail for FTA is finding "wild feeds" - temporary uplinks from site to a network (think breaking news). You can find some serious hilarity here sometimes. But the feeds come and go in a matter of quarter hours, so they're tougher to find.
The feeds are out there, but there aren't many FTA ones in North America. Further complicating things is the myriad of encoding specs (bitrate, constellation, FEC, encoder model, etc.). It ends up being a total crapshoot trying to find things. So I guess what I'm saying is: it depends on what you're looking for. If you're doing this as a hobby to see what you can find, it can be a lot of fun and even rewarding at times. If you're looking to replace cable, you're going to wish you'd spent your money on a Roku or a Slingbox at your friend's house instead.
I'm an audio mixer for several of the national and regional networks. I deal almost exclusively in live sports, and I can tell you we are monitored to a ridiculous degree. We have averaging meters in our trucks (measured in LKFS), and the TOC monitors the show AND the commercials (in dB on a 3-second average). The TOC logs the averages with timecode and video thumbnails (for reference) and saves them, as they are the only defense they have against CALM complaints. The TOC is quick to notify us during the show if we're too loud or too quiet and the averaging is out of compliance.
The problem is, no one at home is smart enough to know the difference between a national spot, a local spot, and a spot that your cable provider inserts. So the complaint becomes "Fox Sports played a loud commercial!!1!!!1!!!one!!!" when the culprit is actually the Comcast head-end in Gary, Indiana.
Between the meters, the logging, and the constant monitoring, broadcast is jumping through a lot of hoops to be CALM compliant. But the networks don't have end-to-end control of their signal, and the end user is at the mercy of their local cable headend. Almost all of the problems you experience happen there. I can't tell you how many times we find a surround downmix where the announcers are almost inaudible, because a cable operator (and sometimes even a satellite provider) is doing an improper downmix, and the 4.1 channels are blowing out the center on the stereo feed. The networks try to QC as much as they can - most of the network offices have receivers for every cable and satellite (and FiOS, AT&T, etc) service they can get their hands on, and constantly monitor as many of them as they can - trying to find and fix the problems proactively rather than wait for the vague and usually inaccurate complaints to roll in from the FCC.
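To give a feel for the averaging side of that monitoring, here's a toy sketch. Real CALM metering uses ITU-R BS.1770 (a K-weighting filter plus gating) to produce LKFS; this deliberately skips the filter and just takes plain RMS in dBFS over 3-second windows, which is enough to show the rolling-average idea - it is not a compliant meter.

```python
# Simplified level metering: RMS per 3-second window, in dBFS.
# NOT BS.1770/LKFS -- no K-weighting, no gating. Illustration only.

import math

def rms_dbfs(samples):
    """RMS level of one window of float samples (-1.0..1.0), in dBFS."""
    if not samples:
        return float("-inf")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms) if rms > 0 else float("-inf")

def window_levels(samples, sample_rate, window_s=3.0):
    """Split audio into windows and measure each one -- the kind of rolling
    log a TOC could stamp against timecode and thumbnails."""
    n = int(sample_rate * window_s)
    return [rms_dbfs(samples[i:i + n]) for i in range(0, len(samples), n)]
```

A compliance logger would then compare each window (or a longer gated average) against the target loudness and flag excursions.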
A little OT, but: They'd be useful for allowing the producers to zoom way in on replays without having to lose resolution downstream. You'd only need the high bandwidth between the camera and the production booth/truck. Or do they already do this and that's what you're talking about?
That's what the existing 4K X-MO cameras are doing - recording everything at a high frame rate and then zooming in. At the Olympics, they would mount one over the pool, then zoom in on whatever lanes were of interest - in full HD resolution, still in super slow-mo. For football, they have the camera shoot wide as an "All-22", and then they can zoom in on anything in the play that's of interest. It's a great idea - it's like having an iso of everyone all the time. The zooming software was clunky, but that gets polished over time.
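The "zoom without losing resolution" trick is really just a crop: a 3840x2160 frame contains four full 1920x1080 windows, so any 1080-sized region can be lifted out at native HD resolution. A minimal sketch, demoed on a tiny stand-in frame rather than real video:

```python
# Cropping an HD-sized window out of a UHD frame. A frame here is just a
# list of rows of pixel values.

def crop(frame, x, y, w, h):
    """Return the w*h window of `frame` at offset (x, y)."""
    if y + h > len(frame) or x + w > len(frame[0]):
        raise ValueError("crop window falls outside the frame")
    return [row[x:x + w] for row in frame[y:y + h]]

# Real use: crop(uhd_frame, x, y, 1920, 1080) with 0 <= x <= 1920 and
# 0 <= y <= 1080 yields a native-resolution HD picture from anywhere in
# the shot. Toy demo on an 8x6 "frame" of (row, col) tuples:
frame = [[(r, c) for c in range(8)] for r in range(6)]
window = crop(frame, 2, 1, 4, 3)
```

The clunky part the author mentions is everything around this - tracking the region of interest smoothly over time - not the crop itself.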
4K is a perfect medium for film. Film is already large format. Every aspect of production, from the ground up, is based around large format, and since it's not live or real-time, you can take more time to ensure quality compression. All you need is a Blu-ray spec and an HDMI/component spec, and you're good to go.
Broadcast is an uphill battle, because there are bottlenecks at every point along the transmission line. 4K cameras need SMPTE fiber, and most facilities are still only wired for copper triax. The switcher upgrade isn't a huge technical problem, but digital replay is - since the bandwidth is going up orders of magnitude(1). The UHD (4K) SDI video transport spec isn't even finalized yet, but it's looking to be between 6-12Gb/s - 4-8x current HD bandwidth. Most fiber transmission lines are still only 270Mb/s, not even enough to fit a full HD signal at 1.5Gb/s (and most cheap networks only use 40-80Mb/s on their backhauls for cost reasons)... and you're basically ruling out satellite, since pushing that much data saturates a good portion of the bird AND leaves you even more susceptible to issues from bad weather. Then once you get it to your cable provider, most HD channels they push out are between 3-12Mb/s, meaning a 4K channel - even if it takes up the space of 4 or 5 HD channels - will have the life squeezed out of it by the time it reaches the end-user. And considering broadcasters still can't even squeeze 1080p out of OTA, there's little chance you'll see a major network adopt it.
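Putting the figures from that cascade side by side makes the squeeze obvious. These constants are just the numbers quoted above (taking the high end of the UHD range and a mid-range cable channel):

```python
# The 4K transmission cascade in numbers, using the rough figures above.

UHD_SDI = 12_000.0    # Mb/s, high end of the proposed UHD transport spec
HD_SDI = 1_500.0      # Mb/s, current HD-SDI
LEGACY_FIBER = 270.0  # Mb/s, SD-era transmission lines still in service
CABLE_HD = 8.0        # Mb/s, a typical compressed HD cable channel

print(f"UHD vs HD-SDI:            {UHD_SDI / HD_SDI:.0f}x the bandwidth")
print(f"HD-SDI vs legacy fiber:   {HD_SDI / LEGACY_FIBER:.1f}x over capacity")
print(f"UHD into 5 HD channels:   {UHD_SDI / (5 * CABLE_HD):.0f}:1 compression")
```

A 300:1 squeeze from the production signal down to the living room is the "life squeezed out of it" the paragraph describes.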
My guess is that film will be the deciding factor as to whether 4K lives or dies as a spec. If enough people see the quality improvement (read: if enough people buy new 65"+ TVs or projectors) then broadcast will make a concerted effort to fill the content void. If everyone shrugs off 4K because they're watching it on their cheap 46" 720p flatscreens, it will dissolve just as quickly as 3D. But 4K has one major advantage over 3D... the end-user isn't required to wear those stupid polarizing glasses. That in itself may give the format life where 3D failed miserably.
(1) There are some highly specialized 4K X-MO cameras out there (SNF/MNF have experimented with it, there was also a few working rigs at the Olympics) - but the rigs required to run them are pretty insane... it requires bonding 8-16 fibers to transfer the data, and trays of hard drives to store only 20-30 seconds of replay data at 240fps. They're neat "toys", just not very practical.
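The footnote's fiber bundles and drive trays follow directly from the arithmetic. The exact pixel format of those experimental rigs isn't public, so the 10-bit 4:2:2 assumption below (two samples per pixel on average) is mine - treat the results as ballpark:

```python
# Uncompressed 4K at 240 fps: data rate and storage, assuming 10-bit 4:2:2.

WIDTH, HEIGHT = 3840, 2160
BITS_PER_PIXEL = 2 * 10   # 4:2:2, 10-bit: luma + alternating chroma sample
FPS = 240

def gbps():
    """Raw video data rate in gigabits per second."""
    return WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1e9

def gbytes_for(seconds):
    """Storage needed for `seconds` of that stream, in gigabytes."""
    return gbps() * seconds / 8

print(f"data rate: {gbps():.1f} Gb/s")          # roughly 40 Gb/s
print(f"30s buffer: {gbytes_for(30):.0f} GB")   # roughly 150 GB
```

Call it ~40 Gb/s - which at ~3 Gb/s per fiber indeed needs on the order of a dozen bonded fibers - and ~150 GB for a 30-second replay buffer, consistent with the "trays of hard drives" above.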
I have hardly ever known a mathematician who was capable of reasoning. -- Plato