121944368
submission
dryriver writes:
The 2010s were not necessarily the greatest decade to live through. AAA computer games were not only DRM'd and Internet-tethered to death but became increasingly formulaic and pay-to-win driven, and poor-quality console ports pissed off PC gamers. Forced software subscriptions became a thing for major software products you could previously buy outright. Personal privacy went out the window in ways too numerous to list, with lawmakers failing on many levels to regulate the tech, data mining and internet advertising companies in any meaningful way. Severe security vulnerabilities were found in hundreds of different tech products, from Intel CPUs to baby monitors and Internet-connected doorbells. Thousands of tech products shipped with integrated microphones, cameras and internet connectivity that couldn't be switched off with an actual hardware switch. Many electronics products became harder or impossible to repair yourself. Printed manuals shipping with tech products became almost non-existent. Hackers, scammers, ransomware gangs and identity thieves caused more mayhem than ever before. Troll farms, click farms and fake news factories damaged the integrity of the Internet as an information source. Tech companies and media companies became afraid of pissing off the Chinese government. Windows turned into a big piece of spyware. Intel couldn't be bothered to innovate until AMD Ryzen came along. Nvidia somehow took a full decade to make even basic realtime raytracing happen, even though the much smaller GPU maker Imagination, with a fraction of the budget, had done it years earlier, and in a mobile GPU to boot. Top-of-the-line smartphones became seriously expensive. Censorship and shadow banning on the once more open Internet became a thing, as did easily triggered people trying to muzzle others on social media. The quality of popular music and music videos went steadily downhill. Star Wars went to shit after Disney bought it, as did the Star Trek films.
And mainstream cinema turned into an endless fest of VFX-heavy comic book movies, remakes/reboots and horror films. In many ways, television was the biggest winner of the 2010s, with many new TV shows with film-like production values being made. The second winner may be computer hardware, which delivered more storage/memory/performance per dollar than ever before. To the question: what, dear Slashdotters, will the 2020s bring us? Will things get better in tech and other areas relevant to nerds, or will they get worse?
121813862
submission
dryriver writes:
Santa, currently over Australia, is flying across the globe and dropping gifts along the way. On this website [spam URL stripped] we can watch him fly over a Google Earth-style 3D satellite map in realtime.
121794318
submission
dryriver writes:
We've all seen the DeepFake videos on YouTube, where a different actor's face is digitally inserted into a film scene in place of the original. Some of these DeepFakes are actually quite convincing. DeepFakes are currently computationally intensive, but may one day happen in realtime on hardware custom-made to accelerate the process. Now to the question: will this "digital face swapping" be a realtime feature in future TVs some day? Will people be able to say to their TV "I don't like this actor/actress. Replace him/her with _actorname_ please"? Or watch a $100 million movie with their own face on an actor's body, essentially making the TV owner the star of the movie playing? Will this perhaps become so normal some day that people in the future look back at our era and say "In those days, you couldn't choose which actors to watch any given piece of content with. Technology wasn't as advanced as it is today back then."?
121536868
submission
dryriver writes:
27 years ago, the BBC's "Tomorrow's World" show broadcast this little gem of a program: [spam URL stripped]?... . After showing old red-cyan anaglyph movies, Victorian stereoscopes, lenticular printed holograms and a monochrome laser hologram projected into a sheet of glass, the presenter shows off a stereoscopic 3D CRT computer display with active shutter glasses. The program then takes us to a laboratory at the Massachusetts Institute of Technology, where a supercomputer is feeding 3D wireframe graphics into the world's first glasses-free holographic 3D display prototype, built around a tellurium dioxide crystal. One of the researchers at the lab predicts that "years from now, advances in LCD technology may make this kind of display cheap enough to use in the home". A presenter then shows a bulky plastic VR headset larger than an Oculus Rift and explains how VR will let you experience completely computer-generated worlds as if you are there. The presenter notes that 1992 VR headsets may be "too bulky" for the average user, and shows a mockup of much smaller VR glasses about the size of Magic Leap's AR glasses, noting that "these are already in development". What is astonishing about watching this 27-year-old TV broadcast is a) the realization that much of today's stereoscopic 3D tech was already around in some form or another in the early 1990s, b) that VR headsets took an incredibly long time to reach the consumer and are still too bulky, and c) that almost three decades later, MIT's prototype glasses-free holographic 3D display technology never made its way into consumer hands or households.
120785162
submission
dryriver writes:
The sheer extent of the ongoing data privacy catastrophe (every piece of software and hardware potentially spies on us, and we don't get to see the source code or circuit diagrams) got me thinking about an intriguing possibility. Will it ever be possible to design and manufacture your own CPU, GPU, ASIC or RAM chip right in your own home? 3D printers already allow objects to be printed at home that would previously have required an injection molding machine. Inkjet printers can produce high-DPI color printouts at home that would previously have required a printing press. Could this ever happen for computer hardware: a compact home machine that can print out DIY electronic circuits right in your home or garage? Could this machine look a bit like a large inkjet printer, where you load the electronics equivalent of "premium glossy photo paper" into it, and out comes a printed or etched or otherwise created integrated circuit that just needs some electricity to start working? If such a machine or "electronics printer" is technically feasible, would the powers that be ever allow us to own one?
120599390
submission
dryriver writes:
The BBC reports: More than 140,000 people died from measles last year as the number of cases around the world surged once again, official estimates suggest.
Most of the lives cut short were children aged under five.
Henrietta Fore, Unicef's executive director, said: "The unacceptable number of children killed last year by a wholly preventable disease is proof that measles anywhere is a threat to children everywhere."
Dr Seth Berkley, chief executive of Gavi, the Vaccine Alliance, said: "It is a tragedy that the world is seeing a rapid increase in cases and deaths from a disease that is easily preventable with a vaccine.
"While hesitancy and complacency are challenges to overcome, the largest measles outbreaks have hit countries with weak routine immunisation and health systems."
Prof Larson said: "These numbers are staggering. Measles, the most contagious of all vaccine-preventable diseases, is the tip of the iceberg of other vaccine-preventable disease threats and should be a wake-up call."
The situation has been described by health experts as staggering, an outrage, a tragedy and easily preventable with vaccines.
Huge progress has been made since the year 2000, but there is concern that incidence of measles is now edging up.
In 2018, the UK, along with Albania, the Czech Republic and Greece, lost its measles elimination status.
And 2019 could be even worse.
The US is reporting its highest number of cases for 25 years, while there are large outbreaks in the Democratic Republic of Congo, Madagascar and Ukraine.
The Pacific nation of Samoa has declared a state of emergency, and unvaccinated families are hanging red flags outside their homes to help medical teams find them.
119927694
submission
dryriver writes:
When you run a realtime video processing algorithm on a GPU, you notice that some math functions execute very quickly on the GPU while others take up far more processing time or cycles, slowing the algorithm down. If you were to implement that exact GPU algorithm as a dedicated ASIC, or perhaps on a beefy FPGA, what kind of speedup, if any, could you expect over a midrange GPU like a GTX 1070? Would hardwiring the same math operations as ASIC circuitry lead to a massive execution-time speedup, as some people claim (e.g. 5x or 10x faster than a general-purpose Nvidia GPU), or are GPUs and ASICs close to each other in execution speed? Bonus question: is there a way to calculate the speed of an algorithm implemented as an ASIC without having an actual physical chip produced? Could you port the algorithm to, say, Verilog or a similar language and then use a software tool to calculate or predict how fast it would run if implemented as an ASIC with certain properties (clock speed, core count, manufacturing process...)?
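One first-order way to reason about the bonus question, before any Verilog or synthesis tooling enters the picture, is a simple throughput model: ops per frame divided into (units x clock x utilization). The sketch below does exactly that in Python; every number in it (ops per frame, unit counts, clock speeds, utilization factors) is an invented assumption for illustration, not a measured figure for any real GPU or ASIC.

```python
# Back-of-envelope comparison of GPU vs. hypothetical ASIC throughput
# for a fixed per-frame workload. All figures are illustrative assumptions.

def frames_per_second(ops_per_frame, ops_per_cycle, clock_hz, utilization):
    """Achievable frame rate given raw throughput and a utilization factor."""
    effective_ops_per_s = ops_per_cycle * clock_hz * utilization
    return effective_ops_per_s / ops_per_frame

OPS_PER_FRAME = 2e9  # assumed math ops needed per processed video frame

# Midrange GPU: high peak throughput, but modest utilization on branchy,
# mixed-function code (the slow math functions the question mentions).
gpu_fps = frames_per_second(OPS_PER_FRAME, ops_per_cycle=3840,
                            clock_hz=1.6e9, utilization=0.25)

# Hypothetical ASIC: lower clock, but a fixed-function pipeline that
# keeps nearly every unit busy every cycle.
asic_fps = frames_per_second(OPS_PER_FRAME, ops_per_cycle=4096,
                             clock_hz=800e6, utilization=0.95)

print(f"GPU  ~{gpu_fps:.0f} fps")
print(f"ASIC ~{asic_fps:.0f} fps, speedup ~{asic_fps / gpu_fps:.1f}x")
```

Real answers come from hardware design flows, which report achievable clock and area after synthesis and place-and-route, but even a toy model like this shows why claimed ASIC speedups hinge mostly on the utilization gap rather than on peak FLOPS.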
119490724
submission
dryriver writes:
Everybody seems to think these days that kids desperately need to learn how to code, or similar, when they turn six years old. But this ignores a glaring fact: the biggest shortage in the future labor market is not people who can code competently in Python, Java or C++, it is people who can actually discover or invent completely new and better ways of doing things, whether in CS, physics, chemistry, biology or other fields. If you look at the history of great inventors, the last truly gifted, driven and prolific non-corporate inventor is widely regarded to be Nikola Tesla, who had around 700 patents to his name by the time he died. After Tesla, most new products, techniques and inventions have come out of corporate, government or similar structures, not from a good old-fashioned dedicated, driven, independent-minded one-person inventor who feverishly dreams up new things and new possibilities and works for the betterment of humanity. How do you teach inventing to kids? By teaching them the methods of Altshuller ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2F... ), for example. Seriously, does teaching five-to-seven-year-olds 50-year-old CS/coding concepts and techniques do more for society than teaching kids to rebel against convention, think outside the box, turn convention upside down and beat their own path toward solving a thorny problem? Why does society want to create an army of code monkeys rather than an army of kids who learn how to invent new things from a young age? Or don't we want little Nikola Teslas in the 21st century, because that creates "uncertainty" and "risk to established ways of doing things"?
119398904
submission
dryriver writes:
This is an episode of the BBC's Tomorrow's World program broadcast all the way back in 1984 ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3F... ), in which a presenter shows hands-on how a laser hologram of a real-world object can be recorded onto a transparent plastic medium, erased again by heating the plastic with an electric current, and then re-recorded differently. The presenter states that computer scientists are very interested in holograms, because the future of digital data storage may lie in them. This was 35 years ago. Holographic data storage for PCs, smartphones et cetera is still not available commercially. Why is this? Are data storage holograms too difficult to create? Or did nobody do enough research on the subject, getting us all stuck with mechanical hard disks and SSDs instead? Where are the hologram drives that appeared "so promising" three decades ago?
118931054
submission
dryriver writes:
Now that we have arrived in Blade Runner's November 2019 "future", the BBC asks what the 37-year-old film got right: Beyond particular components, Blade Runner arguably gets something much more fundamental right, which is the world’s socio-political outlook in 2019 – and that isn’t particularly welcome, according to Michi Trota, who is a media critic and the non-fiction editor of the science-fiction periodical, Uncanny Magazine. “It's disappointing, to say the least, that what Blade Runner ‘predicted’ accurately is a dystopian landscape shaped by corporate influence and interests, mass industrialisation's detrimental effect on the environment, the police state, and the whims of the rich and powerful resulting in chaos and violence, suffered by the socially marginalised.” [...] As for the devastating effects of pollution and climate change evident in Blade Runner, as well as its 2017 sequel Blade Runner 2049, “the environmental collapse the film so vividly depicts is not too far off from where we are today,” says science-fiction writer and software developer Matthew Kressel, pointing to the infamous 2013 picture of the Beijing smog that looks like a cut frame from the film. “And we're currently undergoing the greatest mass extinction since the dinosaurs died out 65 million years ago. In addition, the film's depiction of haves and have-nots, those who are able to live comfortable lives, while the rest live in squalor, is remarkably parallel to the immense disparity in wealth between the world's richest and poorest today. In that sense, the film is quite accurate.” [...] And it can also provide a warning for us to mend our ways. Nobody, surely, would want to live in the November 2019 depicted by Blade Runner, would they? Don’t be too sure, says Kressel.
“In a way, Blade Runner can be thought of as the ultimate cautionary tale,” he says. “Has there ever been a vision so totally bleak, one that shows how environmental degradation, dehumanisation and personal estrangement are so harmful to the future of the world?
“And yet, if anything, Blade Runner just shows the failure of the premise that cautionary tales actually work. Instead, we have fetishised Blade Runner's dystopian vision. Look at most art depicting the future across literature, film, visual art, and in almost all of them you will find echoes of Blade Runner’s bleak dystopia.
“Blade Runner made dystopias ‘cool’, and so here we are, careening toward environmental collapse one burned hectare of rainforest at a time. If anything, I think we should be looking at why we failed to heed its warning.”
118592642
submission
dryriver writes:
Using a compressed disk drive or hard drive is old hat; it has been possible for decades. But when you do this in software or in the OS, the CPU does the compressing and decompressing. Are there any hard drives or SSDs that can work compressed using their own built-in hardware? I am not talking about realtime video compression using a hardware CODEC chip, which does exist and is used, but rather a storage medium that compresses every possible type of file using its own realtime compression and decompression hardware, without a significant speed hit.
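For contrast, the software-side approach the question wants to move away from can be sketched in a few lines: the CPU itself pays a measurable time cost on every compress and decompress, which is exactly the work a drive with built-in compression hardware would offload. A minimal Python sketch (the timings are entirely machine-dependent):

```python
import time
import zlib

# Highly compressible sample data standing in for file contents on disk.
payload = b"the quick brown fox jumps over the lazy dog " * 10_000

# Time the CPU-side compression that a software-compressed volume
# performs on every write.
start = time.perf_counter()
compressed = zlib.compress(payload, level=6)
elapsed = time.perf_counter() - start

ratio = len(payload) / len(compressed)
print(f"compressed {len(payload)} bytes to {len(compressed)} bytes "
      f"({ratio:.1f}x) in {elapsed * 1000:.2f} ms of CPU time")

# Round-trip check: decompression must reproduce the original bytes.
assert zlib.decompress(compressed) == payload
```

Multiply that per-block CPU cost across a busy volume and the appeal of doing the same work in dedicated hardware on the drive itself becomes clear.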
118526296
submission
dryriver writes:
More and more professional 3D software, like 3ds Max, Maya and AutoCAD (Autodesk) and Substance Painter (Adobe), is now only available on a monthly or yearly subscription basis: you cannot buy any kind of perpetual license for these industry-standard 3D tools anymore, cannot install or activate the tools offline, and the tools also phone home every few days over the internet to see whether you have "paid your rent". Stop paying your rent, and the software shuts down, leaving you unable to even look at the 3D project files you created with it. This has caused so much frustration, concern and anxiety among 3D content creators that, increasingly, everybody is trying to replace their commercial 3D software with open source 3D tools. Thankfully, open source 3D tools have grown up nicely in recent years. Some of the most popular FOSS 3D tools are the complete 3D suite Blender ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fwww.blender.org%2F ), polygon modeling tools Wings 3D ( http://www.wings3d.com/ ) and Dust3D ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fdust3d.org%2F ), CAD modeling tool FreeCAD ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fwww.freecadweb.org%2F ), PBR texturing tool ArmorPaint ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Farmorpaint.org%2F ), procedural materials generator Material Maker ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Frodzilla.itch.io%2Fmater... ), image editing tool GIMP ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fwww.gimp.org%2F ), painting tool Krita ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fkrita.org%2Fen%2F ), vector illustration tool Inkscape ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Finkscape.org%2F ) and the 2D/3D game engine Godot Engine ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fgodotengine.org%2F ).
Along with these tools comes a beguiling possibility: while working with commercial 3D tools in the past pretty much forced you to use Windows as your OS, all of the FOSS alternatives above have Linux versions. This means that for the first time, professional 3D users can give Windows a miss and work with Linux instead.
118509926
submission
dryriver writes:
CNN reports: Many of our favorite fast food and restaurant chains continue to contribute to the growing threat of antibiotic resistance, according to a report released Thursday by advocacy groups. The World Health Organization calls the development of bacteria that can't be killed by some of our current medicines "one of the biggest threats to global health, food security, and development today." Fifteen of America's favorites received an "F" for their lack of action in reducing the use of beef raised with antibiotics, including Burger King, DQ, Jack In the Box, Olive Garden, Chili's, Sonic, Applebee's and the pizza chains Domino's, Little Caesars and Pizza Hut. Only Chipotle and Panera Bread, which were early leaders in using only antibiotic-free beef and chicken, received an "A." This is the fifth year that six public interest groups have graded the 25 largest US fast food chains on where they stand on antibiotics. The report, called Chain Reaction V, focuses on antibiotic use in both poultry and beef food items. Antibiotics are routinely given to animals to keep them healthy while they fatten up for slaughter. In fact, nearly two-thirds of the medically important antibiotics sold in the US go to food animals. When antibiotics are overused, some bacteria learn to survive, multiply, and share their resistance genes with other bacteria even if those have not been exposed. Those so-called "superbugs" enter our system when we eat undercooked meat or veggies that have been exposed to irrigation water contaminated with animal waste. And suddenly, antibiotics that once cured our infections no longer do their job. Despite the severity of the problem, the US lacks appropriate laws to regulate overuse of antibiotics in our food chain. Thus advocacy groups have turned to some of the largest buyers of raw beef and chicken, restaurants, and asked them to use their purchasing power to force change.
118444442
submission
dryriver writes:
When McDonald's closed all its restaurants in Iceland in 2009, one man decided to buy his last hamburger and fries. "I had heard that McDonald's never decompose so I just wanted to see if it was true or not," Hjortur Smarason told AFP. This week, it's 10 years since the seemingly indestructible meal was purchased, and it barely looks a day older. Curious observers can watch a live stream ( https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fsnotrahouse.com%2Flast-m... ) of the burger and fries from its current location in a glass cabinet in Snotra House, a hostel in southern Iceland. "The old guy is still there, feeling quite well. It still looks quite good actually," the hostel's owner Siggi Sigurdur told BBC News. "It's a fun thing, of course, but it makes you think about what you are eating. There is no mould, it's only the paper wrapping that looks old." The hostel claims that people come from around the world to visit the burger, and the website receives up to 400,000 hits daily.
118440762
submission
dryriver writes:
Several rival companies may be hard at work trying to get Elon Musk's Hyperloop concept off the ground, but hurtling across country — maybe even across continents — at 600 miles per hour in a low-pressure steel tube still feels far from reality. But 13-year-old New York student Caroline Crouchley may have invented a more economically viable and eco-friendly Hyperloop solution. Crouchley's idea, which just won second place in the annual 3M Young Scientist Challenge, is to build pneumatic tubes next to existing train tracks. Magnetic shuttles would travel through these vacuum tubes, connected via magnetic arm to trains traveling on the existing tracks. This system would utilize current train tracks, thereby cutting infrastructure costs and, Crouchley says, eradicating the potential safety risk posed by propelling passengers in a vacuum. There'd be no need for trains to use diesel or electric motors, making the trains lighter and more fuel-efficient. This is important to Crouchley, who aims to devise active solutions to the climate crisis. "I pinpointed transportation as something I wanted to work on because if we can make trains more efficient, then we can eliminate the amount of cars, trucks and buses on the road," Crouchley tells CNN Travel.