Apple WatchOS 7 Unveiled With Sleep Tracking, Shareable Watch Faces and a Hand Washing Timer

See the original posting on Slashdot

At the WWDC 2020 keynote today, Apple unveiled WatchOS 7, adding sleep tracking features, shareable watch faces, a hand washing timer, and much more. CNET reports: WatchOS 7 adds much-anticipated sleep tracking features, including a Wind Down mode to help you get to bed on time. When you wake up, you’ll see a visualization of your previous night’s sleep, including periods of wake and sleep, and a chart showing weekly sleep trends. While the watch lacks a watch face store, the new software adds the ability to share customized watch faces and complications. If a shared watch face uses apps you don’t have, you’ll be able to download them easily. You can also easily share a face you’ve created yourself with a contact, or across social media.

In terms of health and fitness, WatchOS 7’s Workout app adds support for four new workouts: core training, dance, functional strength training and cool down. The Activity app where you track your workouts will now be called Fitness, and will include a new Summary tab that gives you an easy way to see your activity history, workouts and trends, all in one view. With WatchOS 7, you’ll be able to use Siri to translate several languages directly from your wrist. Like on iOS 14, in the Apple Maps app on WatchOS 7, you’ll be able to see cycling-specific directions on your watch in some cities. Amid the coronavirus pandemic, WatchOS will use machine learning to determine when you start washing your hands and will set a timer so you know you’re getting the recommended 20-second wash.

Read more of this story at Slashdot.

Apple Will Let You Emulate Old Apps, Run iOS Apps on ARM Macs

See the original posting on Slashdot

At the WWDC 2020 keynote today, Apple announced that the company is going to switch from Intel chips to Apple’s own silicon, based on ARM architecture. They also announced that iPad and iPhone apps will be able to run natively on ARM-powered Macs. TechCrunch reports: First, you’ll be able to compile your app to run both on Intel-based Macs and ARM-based Macs. You can ship those apps with both executables using a new format called Universal 2. If you’ve been using a Mac for a while, you know that Apple used the same process when it switched from PowerPC CPUs to Intel CPUs — one app, two executables. As for unoptimized software, you’ll still be able to run those apps. But their performance won’t be as good as what you’d get from a native ARM-ready app. Apple is going to ship Rosetta 2, an emulation layer that lets you run old apps on new Macs.

When you install an old app, your Mac will examine the app and try to optimize it for your ARM processor. This way, there will be some level of optimization even before you open the app. But what if it’s a web browser or a complicated app with just-in-time code? Rosetta 2 can also translate instructions from x86 to ARM on the fly, while you’re running the app. And if you’re a developer working on code that is going to run on servers, Apple is also working on a set of virtualization tools. You’ll be able to run Linux and Docker on an ARM Mac.

As a bonus, users will also be able to access a much larger library of apps. “Mac users can for the first time run iOS and iPadOS apps on the Mac,” Apple CEO Tim Cook said. While the company didn’t share a lot of details, Apple isn’t talking about Catalyst, its own framework that makes it easier to port iOS apps to macOS. You should be able to download and run apps even if the developer never optimized those apps for macOS.

Read more of this story at Slashdot.

Reporter Tests Walmart’s $140 Laptop ‘So You Wouldn’t Have To’

See the original posting on Slashdot

Ars Technica’s technology reporter Jim Salter tested Walmart’s 11.6-inch EVOO laptop, which sells for $139 and ships with just 2GiB of RAM and a 32GB SSD, which he worries “simply is not enough room for Windows itself, let alone any applications.”
The first thing I noticed while looking through the Windows install is that our “internal” Wi-Fi is actually a cheap USB 2.0 Realtek adapter — and it’s 2.4GHz-only 802.11n, at that. The second thing I noticed was the fact that I couldn’t install even simple applications, because the laptop was in S mode. For those unfamiliar, S mode locks a system into using only the Edge browser and only apps from the Microsoft Store. Many users end up badly confused by S mode, and some unnecessarily buy a new copy of Windows trying to get out of it. Fortunately, if you click the “learn more” link in the S mode warning that pops up when you attempt to load a non-Store app, you are eventually led to a free Microsoft Store app which turns S mode off. On my first try, this app crashed. But on the second, it successfully disabled S mode, leaving me with a normal Windows install….
I verified that I was on an older version of Windows 10 — build 1903, from March 2019 — and initiated an upgrade to build 2004, from April 2020. Windows 10 was having none of it. It wanted at least 8GiB of free space on C:, and I couldn’t even get to 6GiB free, after only a day of using the system…. Meaningful benchmark results were impossible to attain on this laptop, since it was too slow and quirky to even run the benchmarks reliably. But I didn’t let a silly thing like “being obviously inappropriate” stop me from slogging painfully through the benchmarks and getting what numbers I could. The first suite up, PCMark 10, eventually produced a score of zero. I didn’t know that a zero score was even possible. Apparently, it is… Cinebench R20 also took several tries to complete successfully, and eventually the test produced a jaw-droppingly bad score of 118…

Under Fedora 32 — selected due to its ultra-modern kernel, and lightweight Wayland display manager — the EVOO was incredibly balky and sluggish. To be fair, Fedora felt significantly snappier than Windows 10 had on this laptop, but that was a very, very low bar to hurdle. The laptop frequently took as long as 12 seconds just to launch Firefox. Actually navigating webpages wasn’t much better, with very long pauses for no apparent reason. The launcher was also balky to render — and this time, with significantly lower memory usage than Windows, I couldn’t just blame it on swap thrashing… [W]ith the laptop completely open, several questions are answered — the reason I hadn’t heard any fan noise up until this point is because there is no fan, and the horrible CPU performance is because the CPU can’t perform any better than it does without cooking itself in its own juices….
At first, I mistakenly assumed that the A4-9120 was just thermally throttling itself 24/7. After re-assembling it and booting back into Fedora, I found the real answer — the normally 2.5GHz chip is underclocked to an anemic 1.5GHz. The system BIOS confirms this clockrate but offers no room to adjust it — which is a shame, since the system never hit temperatures higher than about 62C in my testing.
His verdict? Walmart’s EVOO laptop “doesn’t have either the RAM or the storage to do an even vaguely reasonable job for normal people doing normal things under Windows, even when limited to S mode…
“There may be a purpose this laptop is well-suited to — but for the life of me, I cannot think what it might be.”

Read more of this story at Slashdot.

David Heinemeier Hansson Explains What It Takes to Write Great Code

See the original posting on Slashdot

The “bespoke development” site Evrone.com (an IT outsourcing company) interviewed Ruby on Rails creator David Heinemeier Hansson (who is also co-founder and CTO of Basecamp — and a racecar driver) shortly before he spoke at RubyRussia, Evrone’s annual Moscow programming conference.
And they asked him an interesting question. As a man who’s seen lots of Ruby code, “what makes code good or shitty? Anything that is obvious for you at first glance?”
David Heinemeier Hansson: If the code is poorly written, usually it smells before you even examine the logic. Indentation is off, styles are mixed, care is simply not shown. Beyond that, learning how to write great code is a lifelong pursuit. As I said in my RailsConf 2014 keynote, we’re not software engineers, we’re software writers. “Writing” is a much more suitable metaphor for what we do most of the time than “engineering” is. Writing is about clarity and presenting information in a clear-to-follow manner so that anybody can understand it.

There’s no list of principles and practices that somebody can be taught and then they will automatically produce clear writing every time. If you want to be a good writer, it’s not enough just to memorize the dictionary. Just knowing the words available to you, knowing the patterns of development is not going to make you a good developer. You have to develop an eye. You have to decide that the most important thing for your system is clarity. When you do decide that, you can start developing an eye.

The only way to become a good programmer, where I define a good programmer as somebody who writes software with clarity, is to read a lot of software and write a lot of software.
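
As a contrived illustration of the “smell” Hansson describes (our example, written in TypeScript rather than Ruby, with invented names, not anything from the interview), here is the same small routine written carelessly and then written for clarity:

```typescript
// Careless: cryptic names, inconsistent formatting, intent buried in abbreviations.
function calc(u: { a: number; d: number }[]) {
  let t = 0; for (const x of u) { if (x.d > 0) { t += x.a - x.a * x.d } else { t += x.a } }
  return t;
}

// The same logic written for clarity: descriptive names, one idea per line.
interface LineItem {
  amount: number;
  discountRate: number; // 0 means no discount
}

function totalAfterDiscounts(items: LineItem[]): number {
  return items.reduce(
    (total, item) => total + item.amount * (1 - item.discountRate),
    0,
  );
}

console.log(calc([{ a: 100, d: 0.1 }]));                               // 90
console.log(totalAfterDiscounts([{ amount: 100, discountRate: 0.1 }])); // 90
```

Both versions compute the same number; only the second one reads the way Hansson says good software should.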

In 2016, David Heinemeier Hansson answered questions from Slashdot readers.

Read more of this story at Slashdot.

Stack Overflow Explores Why Developers Love TypeScript More Than Python

See the original posting on Slashdot

Stack Overflow asked 65,000 programmers for their favorite programming language, and this year Microsoft’s TypeScript knocked Python from the #2 spot. So they interviewed Microsoft’s principal engineering lead for the language “to find out what about TypeScript makes it so dang lovable.”
Q: Do you remember why the team came up with TypeScript, why you wanted to release something like this?

A: When I joined the team, there were a lot of people at Microsoft who wanted to develop JavaScript at what we call “application scale.” Teams like TFS and Office wanted to build large JavaScript applications. A lot of those people had familiarity with statically-typed languages — C++, C#, Java, that kind of thing. They wanted to have that static typing available both for conceptual scalability and for the tooling…
Q: Was there a point where you saw an adoption point of no return? Was there something that came along where people were like, oh, yeah, we do TypeScript now?
A: Oh, it was definitely Google announcing that they were going to use TypeScript with Angular. That’s kind of lost to time now. But if you look at the graphs for TypeScript, literally any graph — GitHub stars, downloads, pull requests — you can see the exact point when that Angular announcement came out. And the graph just changes. It never looks back… TypeScript shores up that last rough edge on JavaScript and gives you something that’s just really fun to work with and runs everywhere. I think if TypeScript were a language that was built on top of a less universal language or a less fun language, I don’t think it would be as successful. It’s really taking something that’s great and making it better…

I think my favorite thing that I see is people on the Internet saying, ‘I did this huge refactoring in TypeScript and I was refactoring for three hours. And then I ran my code and it worked the first time.’ In a dynamic language, that would just never, ever happen….

I would just say to people, if static types aren’t a good fit for you, for either your programming style or the problem you’re working on, just skip it. That’s fine. It’s okay. I won’t be offended. If someone can get a thirty thousand line application that gets its job done without static types, I’m very impressed. That just seems really difficult. But kudos to those people who make it work. Python’s the same way. Very few people have working Python type annotations, but Python is incredibly popular. I think the data speaks for itself — I think Python is number three in the survey… I guarantee you that a very small proportion of those Python developers have static types. Whatever your problem domain is, that might be the best fit for you.
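
For readers who haven’t tried it, here is a minimal sketch (our example, not one from the interview, with invented type and function names) of the compile-time safety behind that “refactoring worked the first time” anecdote: reshape a type, and the compiler points at every call site that still assumes the old shape.

```typescript
interface User {
  id: number;
  fullName: string; // previously two separate fields: firstName and lastName
}

function greet(user: User): string {
  return `Hello, ${user.fullName}!`;
}

// Any code still written against the old shape now fails to compile, e.g.:
//   Property 'firstName' does not exist on type 'User'.
// const oldGreeting = (user: User) => `Hello, ${user.firstName}!`;

console.log(greet({ id: 1, fullName: "Ada Lovelace" }));
```

Restore the commented-out line and tsc refuses to build; a dynamic language would only surface the mismatch at runtime.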

Read more of this story at Slashdot.

Microsoft’s GitHub Offers Open-Source Developers ‘One Linter to Rule Them All’

See the original posting on Slashdot

“GitHub says it’s open-sourcing its in-house linting tool, the GitHub Super Linter, to clean up code,” reports ZDNet:

Having a tool that checks source code for programming blunders and other errors is useful for developers. Now Microsoft-owned GitHub has released the ‘Super Linter’ to help developers avoid the hassles of setting up code repositories with multiple linters…

GitHub describes it as a “simple combination of various linters, written in bash, to help validate your source code” for the purpose of preventing broken code from being uploaded to a ‘master’ branch, the key branch that other branches in a tree are merged to… The Super Linter Action lets developers ‘lint’ or check their code base using popular linters for Python, JavaScript, Go, XML, YAML, and more programming languages. As such, GitHub engineer Lucas Gravley describes the Super Linter as the “one linter to rule them all”.

“The GitHub Super Linter was built out of necessity by the GitHub Services DevOps Engineering team to maintain consistency in our documentation and code while making communication and collaboration across the company a more productive experience,” says Gravley… “When you’ve set your repository to start running this action, any time you open a pull request, it will start linting the code base and return via the Status API. It will let you know if any of your code changes passed successfully, or if any errors were detected, where they are, and what they are,” explains Gravley.
The Super Linter doesn’t fix problems but does flag them, so developers can then go back and fix them before they reach the master branch.
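
ZDNet’s report doesn’t include a sample finding, but the workflow is easy to picture. Assuming a repository that has enabled the bundled JavaScript/TypeScript linter (ESLint), a pull request containing something like the hypothetical snippet below would fail the status check before it could reach master:

```typescript
// Hypothetical pull-request code: ESLint's "no-cond-assign" rule flags the
// accidental assignment inside the if-condition, so the linter status check
// fails and the mistake never lands on the master branch.
export function isAdminSloppy(role: string): boolean {
  if (role = "admin") { // assignment where a comparison was intended
    return true;
  }
  return false;
}

// The comparison the author actually intended; this version passes the check.
export function isAdmin(role: string): boolean {
  return role === "admin";
}
```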

Read more of this story at Slashdot.

Dropbox is a Total Mess

See the original posting on Slashdot

Veteran journalist Om Malik, writing on his blog: I was reading Nikita Prokopov’s blog this morning and came across his very visual damnation of what is wrong with Dropbox. Like me, he too had thought that “in the beginning, Dropbox was great, but in the last few years, they started to bloat up.” He visually shows that as an existing customer, you need to jump through a dozen hoops to get Dropbox going on a new machine. And if you are just signing up, add another five steps. His sentiments reflect my feelings about Dropbox, as well. When I fell in love with Dropbox, it had not even launched. It was simple and elegant. It was nothing like anything I had experienced before. And I wasn’t alone. The company was one of the fastest-growing companies in Silicon Valley, because customers appreciated their simplicity and ease of use. Their revenues and userbase grew at an astonishing speed. For nearly a decade, I stayed loyal to the service, but like Prokopov, I too felt the bloat was getting too much. […] I don’t blame Dropbox for going the way they have — they are less about the individual customers and more focused on teams and corporations. That’s where the money is — and when you go public, you are all about the “quarterly goals.” You don’t go public without knowing that Wall Street owns you.

Read more of this story at Slashdot.

Lego Unveils New ‘Robot Inventor’ Mindstorms Kit

See the original posting on Slashdot

After seven years, Lego has finally unveiled a new Mindstorms kit, reports PC Magazine — the Lego Mindstorms Robot Inventor, available this fall for $359:
The Robot Inventor kit lets kids (or adults) build five different robot models out of 949 pieces, ranging from a four-legged walker to a bipedal wheeled robot that can give high-fives. All of these robots can be programmed to perform different tricks, like grabbing items, firing plastic projectiles, avoiding obstacles, and playing various sports with a ball.

The kit includes four low-profile, medium-angular motors; a color and light sensor; and a distance sensor, which work together with the Intelligent Hub block to power these robots and execute commands. Of course, like all Mindstorms kits, you can build your own robotic creations with the tools at hand, and add Lego Technic and System pieces for more complex projects.

The Intelligent Hub serves as the brain of Lego Mindstorms, and the block that houses the Mindstorms Robot Inventor Kit is the most advanced one yet. It features six input/output ports for sensors and motors, a six-axis gyro/accelerometer, a speaker, and a five-by-five LED matrix. The Intelligent Hub and all robots built with it can be controlled wirelessly over Bluetooth with the Lego Mindstorms Robot Inventor app for Android, iOS, Windows 10, and macOS. The app supports programming in both the tile-based Scratch language and in Python, for more complex projects that require the precision of written code.

Read more of this story at Slashdot.

Why One of Kubernetes’ Creators Moved From Google To Microsoft

See the original posting on Slashdot

Long-time Slashdot reader destinyland writes: One of the three Google employees who created Kubernetes — the open source container-orchestration platform now maintained by the Cloud Native Computing Foundation — was software engineer Brendan Burns. But in 2016 Burns became an engineer at Microsoft (where since March he’s been a corporate vice president).
This week, in a new podcast interview, Burns explained why he went from Google to Microsoft, which was “all-in on cloud”:

Obviously growing up in Seattle, Microsoft is sort of like the home-town team — so that was a big plus also. And it’s been great to be able to come in and really help them figure out — I think one of the really amazing things about being there is it’s a company in transition. Certainly four years ago when I joined, it’s a company in transition. And getting a chance to help continue that transition, and help continue and shift its focus from closed-source and Windows to a really renewed focus on open source and Linux and cloud native application development — that ability to influence and help shape direction has been really awesome also.
But it was more than just their commitment to the cloud…
“There’s just such a great developer history there, of developer tooling and developer productivity. Just such a focus on empowering people to build stuff. That’s really compelling to me too, because I think one of the things we really haven’t done a good job of in Kubernetes is make it easier to build these programs. Right? We do a lot to make it easier to operate the stuff, but it’s still really hard to build these systems, and Kubernetes isn’t helping you at all. So I’m really excited and interested and thinking a lot about how can we make it easier for developers to build systems. And I think the DNA and history and experience of Microsoft to build things, the hugely successful platform that is Windows, means there’s just a great — a really strong amount of DNA about what it takes to build a platform that doesn’t just succeed for elite devs but can really succeed for people all the way from no-code solutions all the way through to advanced systems solutions. And so that opportunity is really exciting.”

Read more of this story at Slashdot.

Can AI Design Games? How Nick Walton Created ‘AI Dungeon’

See the original posting on Slashdot

shirappu writes: Nick Walton created AI Dungeon as an experiment to build an AI dungeon master for D&D games. Since then, it’s grown into a text adventure game in which players can type in anything they want, with the game’s AI responding dynamically [and with over 1.5 million players and multiplayer adventures.]

In this interview about the year since its release, Nick talks about how it works and what they’re working on now: quest systems, world persistence, and longer-term memory. He also opens up about where he thinks AI systems can support game development. “One of our visions for AI is not as a tool to replace game designers, but a tool for augmenting their work. We want to make it easier to create awesome games. If it only takes one or two people to build an awesome game because AI fills in the details, it opens up doors for a lot of people.
“We really want to enable people to build cool things with this tech. Deploying this kind of AI training and these massive models is really hard for the average person, so our hope is that we build out the infrastructure and platform, and then let people build cool things on top of that.”

Walton says they’ve recently added a modding feature that “basically allows for people to create custom modifications for their worlds.”
In a test run I was a cyberpunk “living in the futuristic city of Zail. You have a bag of drugs and a holoband. You wake up in a dark alley with bruises all over your body. You have no idea what happened. You stand up and see three men pointing guns at you…”

Read more of this story at Slashdot.

WarnerMedia Is Getting Rid of the HBO Go App

See the original posting on Slashdot

The Verge reports that WarnerMedia is getting rid of the HBO Go app in an attempt to reduce some of the confusion about which app is for which purpose. From the report: HBO Max is AT&T’s new streaming service that lets you access the entire HBO library plus additional content like Cartoon Network shows and the Studio Ghibli movies. You can subscribe to HBO Max directly for a $14.99 monthly fee, but it’s also offered for free from many cable providers if you subscribe to HBO, and it’s free as part of some AT&T wireless, internet, or TV plans. A key thing to know is that HBO Max is really an expanded and rebranded version of HBO Now, the company’s previous streaming-only service. On most platforms, like Apple TV, the HBO Now app was directly updated to become HBO Max.

Before HBO Max existed, cable subscribers could stream HBO shows using an app called HBO Go. WarnerMedia will be getting rid of that app (or “sunsetting” it, in WarnerMedia’s language) from “primary platforms” as of July 31st. If you previously relied on HBO Go, many cable providers will already let you log in to HBO Max. You can see that full list here. That “primary platforms” language is important, because WarnerMedia still hasn’t struck deals to bring HBO Max to Roku or Amazon streaming devices. On those platforms, WarnerMedia is not upgrading the HBO Now app to become HBO Max. Instead, it’s rebranding to simply be “HBO,” where it will still cost $14.99, even though you’ll only be able to watch HBO content on it and not the expanded HBO Max catalog. This branding switch will be happening over the coming months, according to WarnerMedia.

Read more of this story at Slashdot.

System76 Launches AMD Ryzen-Powered ‘Serval WS’ Portable Linux Workstation

See the original posting on Slashdot

Linux computer maker/seller System76 has been offering AMD processors in its Thelio desktop computers, but believe it or not, the company has never offered an AMD-powered laptop — until now, that is. From a report: You see, starting today, you can buy a “Serval WS” powered by AMD. No, System76 is not offering mobile Ryzen chips in this laptop, but instead, desktop-class processors. As you can expect, this 15-inch portable Linux workstation is quite chunky and heavy as a result. With that said, it is simply impossible to cram this much power into a thin and light notebook. “The Serval WS comes with either the 3rd Gen Ryzen 3600, 3700X, or 3900 CPU. The latter is equipped with 12 Cores and 24 Threads, making this laptop perfect for taking on heavy computational loads. Having this kind of desktop-caliber power in a laptop body helps if you need to run complex simulations at your desk or quickly render 3D scenes while on the road. AMD CPUs are also known for having an extremely high performance per price, which means you get maximum bang-for-buck,” says System76. The laptop starts at $1,300 — and BetaNews has all the specs.

Read more of this story at Slashdot.

The Future of Xbox Isn’t Just a Console

See the original posting on Slashdot

With the Xbox Series X on the horizon, Microsoft’s head of videogame hardware sees a future where consoles may no longer be front and center. Wired reports: Despite its massive push for the Xbox Series X, Microsoft is hedging its bets that a decade from now more and more gamers will be taking a “no gods, no masters” approach to where and how they play. Phil Spencer, head of Xbox, thinks whether consoles will exist in 10 years is the wrong question to ask. “In the long run, to me, it’s a question about the viability of the television,” said Spencer last week in an interview with WIRED. “There’s this calculus, this chess match we’re playing,” says Spencer. “It’s no longer checkers.” Spencer’s chess match isn’t against Sony or Nintendo; it’s against the ever-changing trends in how two billion gamers worldwide consume media. When the Xbox Series X arrives in stores later this year, it will become a part of Xbox’s chimera approach — alongside its cloud gaming service, Project xCloud, and Xbox Play Anywhere — to capture gamers wherever they are. With xCloud, you’ll pay a currently undefined subscription to stream AAA games onto your mobile phone and tablet. With Xbox Play Anywhere, you can buy, say, Forza Horizon 4 and play it on both Xbox One and Windows 10 on PC.
[…]
Spencer paints the Xbox Series X and the “game anywhere on stuff you have” pitches as complementary rather than cannibalistic. “I don’t think it’s ‘hardware agnostic’ as much as it’s ‘where you want to play,’” he says. Which makes sense: The more ways to play, and the more services Microsoft provides, the more repeatable revenue flowing into Microsoft’s coffers. After the hype around the Xbox Series X cools down and the hardware-content singularity approaches, it’s possible that many of the people opting to play Xbox games will do so on everything except the Xbox. It seems fair to ask whether this generation of dedicated consoles will be the last. “I like watching TV. I like playing games on TV. It’s where I play most of the time,” says Spencer. “I think there will be — for a long time — a world where people want to play on a television, and we’re committed to that and we will deliver great console experiences. I don’t think Xbox Series X is our last console. I think we will do more consoles to make that great television play experience work and be delightful.”

And if not, well, the company still has options. “The nice thing about being in a company the scale of Microsoft is we’re able to make bets across a lot of those fronts and we’re not really dependent upon any one of those individual kinds of businesses or relationships to succeed,” says Spencer.

Read more of this story at Slashdot.

Bryan Lunduke Explains Why Linux Sucks in 2020

See the original posting on Slashdot

Roblimo once called it “a tradition, not just a speech” — Bryan Lunduke’s annual “Linux Sucks” presentations at various Linux conferences. But before you get too upset, in his 2014 interview with Slashdot Lunduke admitted “I love Linux, I have made my whole life around Linux. I work for Linux companies. I write for Linux magazines, but it really blows…”

This year he’s releasing a special YouTube version of Linux Sucks 2020, the first time Lunduke has attempted the talk without a live audience, “And it feels really wicked weird.” But he’s still trying to get a rise out of his audience. “Follow me on this into Journey Into Graphs and Numbers Land,” Lunduke says playfully, pulling up one of his 160 x 90 pixel slides showing current market share for Windows, Mac, and then Linux. “You might notice that some platforms have a higher market share than Linux does,” he says with a laugh, describing one slide showing Linux as “scooping up the bottom of the barrel at 1.6%…”

“But here’s the thing. These numbers have been either consistent, or for Linux, slowly dropping.” And then he puts up a graph showing the number of searches for Linux. “If you look back at 2004 — the year 2004, 16 years ago — that was the high point in interest in searching for the word Linux (or Linux plus other things). 2006 it was about half that — so about two years later it had dropped down to about half. Here in 2020 it is so low, not only does it not fill up the first bar of pixels there, it’s like only three pixels in. That doesn’t happen — that sort of decline does not happen — unless the platform sucks. That’s just the truth of the matter. That’s just how it goes, right?”

And there are also some very specific reasons why Lunduke thinks Linux sucks:

Read more of this story at Slashdot.

Stack Overflow Investigates Why Developers Love Rust So Much

See the original posting on Slashdot

This year Stack Overflow’s Developer Survey of 65,000 programmers found that Rust was their most-loved programming language — for the fifth year in a row. To understand why, they interviewed the top contributor to the site’s Rust topic. (“The short answer is that Rust solves pain points present in many other languages, providing a solid step forward with a limited number of downsides…”) But Stack Overflow also reached out to the Rust core team, including Berlin-based developer Erin Power, asking about any barriers to entry, and why they think Rust was the survey’s most-loved language. (“I think it’s because Rust makes big promises, and delivers on them…”)
And finally, they got responses from Stack Overflow users in their Rust chatroom and forums, noting “Rust users are a passionate bunch, and I got some fascinating insights along with some friendly debates…”
Many current programming discussions revolve around whether to use a fast, low-level language that lets you handle memory management or a higher-level language with greater safety precautions. Fans of Rust like that it does both…. While some languages just add polish and ease to existing concepts, several users feel that Rust is actually doing new things with a programming language. And it’s not doing new things just to be showy; they feel these design choices solve hard problems with modern programming…

Stack Overflow user janriemer: “A quote from Chris Dickinson, engineer at npm, sums it up perfectly for me, because I have thought the same, without knowing the quote at that time: ‘My biggest compliment to Rust is that it’s boring, and this is an amazing compliment.’ Rust is a programming language that looks like it has been developed by user experience designers. They have a clear vision (a why) of the language and carefully choose what to add to the language and what to rework, while listening to what the community really wants. There are no loose ends, it’s all a coherent whole that perfectly supports a developer’s workflow.”

Stack Overflow’s post also quotes Jay Oster, a software architect at the infrastructure-as-a-service company PubNub, who argues Rust “ticks all the boxes”:

- Memory safe
- Type safe
- Data race-free
- Ahead-of-time compiled
- Built on and encourages zero-cost abstractions
- Minimal runtime (no stop-the-world garbage collection, no JIT compiler, no VM)
- Low memory footprint (programs run in resource-constrained environments like small microcontrollers)
- Targets bare-metal (e.g. write an OS kernel or device driver; use Rust as a ‘high level assembler’)
He also describes Rust as “akin to wandering around in complete darkness for an entire career, and suddenly being enlightened to two facts:

1. You are not perfect. You will make mistakes. Those mistakes will cause you a lot of problems.
2. It doesn’t have to be this way.”

Read more of this story at Slashdot.

Solving Online Events

See the original posting on Slashdot

Benedict Evans: I suspect part of the answer to this is actually that a lot of physical events will come back in some form as we emerge from lockdown. But this also makes me think that there will be new tools with much more radically new approaches, and some new behaviours and habits. Hence, it’s often struck me that networking events are pretty inefficient and random. If you’re going to spend an hour or two in a room with 50 or 500 people, then you could take that as a purely social occasion and enjoy yourself. But if your purpose is to have professionally useful conversations, then what proportion of the people in the room can you talk to in an hour and how likely is it that they’ll be the right ones? Who’s there? I sometimes suggest it would be helpful if we all wore banners, as in the image at the top, so that you could look across the room and see who to talk to. (First Tuesday did something like this in 1999, with different coloured badges.)

This might just be that I’m an introvert asking for a machine to manage human connections for me (and I am), but there is also clearly an opportunity to scale the networking that happens around events in ways that don’t rely on random chance and alcohol tolerance. A long time ago Twitter took some of that role, and the explosion of online dating also shows how changing the way you think about pools and sample sets changes outcomes. In 2017, 40% of new relationships in the USA started online. Next, before lockdown, you would often have planned to schedule a non-urgent meeting with a partner or client or connection ‘when we’re in the same city.’ That might be at some specific event, but it might also just be for some ad hoc trip — ‘next time I’m in the Bay Area’ or ‘next time you’re in New York.’ In January most people would never actually have thought of making a video call, but today every meeting is a video call, so all of those meetings can be a video call too, and can happen this week rather than ‘next time I fly to that city’ — or ‘at CES/NAB/MIPCOM.’ In the last few months video calls have broken through that habit. I wonder what happens if we accelerate all of those meetings in that way. To argue against some of this, James Turrell has said that part of the value of Roden Crater’s remoteness is that you have to really care to go there. Getting a plane and a hotel and a ticket, and taking days of time, has some of the same effect for a conference — it gives a selection filter for people who care. There is value in aggregating people around a professional interest graph, and in doing that in a focused way, perhaps even around a particular time. (There are also, of course, exclusionary effects to this.)

Read more of this story at Slashdot.

‘Lord of the Rings’ Reunion Brings Actors, Director, Writers Together on Zoom

See the original posting on Slashdot

“Just about the entire cast of The Lord of the Rings gathered their Zoom screens together for a reunion nearly two decades after the end of the epic fantasy film trilogy,” reports CNET:

io9 notes that it was comic actor-singer Josh Gad who “gathered the hobbits, the wizards, the elves, and the wicked menfolk to go to Isen — YouTube, where they joke, talk shop, reminisce, and just seem to really thoroughly enjoy each others’ presence. In this stream are Elijah Wood [Frodo], Sean Astin [Sam], Ian McKellen [Gandalf], Orlando Bloom [Legolas], Viggo Mortensen [Aragorn], Liv Tyler [Arwen], and more, along with director Peter Jackson and, presumably, the kind doting ghost of J.R.R. Tolkien just off-screen.”
The Wrap has more details, including the fact that the event was held to support No Kid Hungry, a charity dedicated to ending childhood hunger, and some ways they changed J.R.R. Tolkien’s book for the movie:

“Gandalf does not say, ‘You shall not pass!’ in the book,” McKellen notes. “He says, ‘You will not pass.’” [Co-writer Philippa] Boyens also notes that Gandalf’s first line in the trilogy was one she came up with herself, instead of coming from Tolkien: “A wizard is never late, Frodo Baggins. Nor is he early. He arrives precisely when he means to.”

Another moment from the trilogy that gained Internet immortality was Boromir’s famous “One does not simply” speech, where he warns the Council of Elrond that trying to sneak into Mordor to destroy the One Ring is impossible. Jackson admits that the speech was written the day before the council scene was filmed, and while Sean Bean delivered it so well that it became a meme, he needed some help to remember it.
“What Sean did, which I thought was really clever, is he got a print-out of the speech taped to his knee,” Jackson said, pointing out Bean places his hand to his head to display Boromir’s sense of despair. “If you watch the scene now, you’ll see every time that Sean has to check his script.”

Read more of this story at Slashdot.

82-Year-Old Ridley Scott Shares Some Secrets About ‘Alien’

See the original posting on Slashdot

Ridley Scott was the fifth choice to direct the 1979 film Alien, remembers the Los Angeles Times, “meaning that no one was expecting the film to become as important and influential as it now is.”
This week they chronicled some more remembrances about the film from 82-year-old Ridley Scott:

The central role of Ellen Ripley — also portrayed by Sigourney Weaver in three subsequent sequels — was originally written as a man… “I think it was Alan Ladd [then president of 20th Century Fox] who said, ‘Why can’t Ripley be a woman?’ And there was a long pause, that at that moment I never thought about it. I thought, why not, it’s a fresh direction, the ways I thought about that. And away we went… I found Sigourney by word of mouth. Somebody had been told that Sigourney was on an off-Broadway stage doing something, that I should meet. And I did,” Scott said. “And there it was, she was perfect. In terms of scale, size, intelligence, her acting is just fantastic. And so it was made for her, really.”

The film’s notorious chest-burster scene, in which an alien creature emerges from within actor John Hurt’s chest, is now among the classic scenes in modern horror cinema. It was shot with multiple cameras because Scott could only really perform the full effect once, “because once I blew blood all over that set, there was no cleaning it up… I kept it very much from the actors and I kept the actual little creature, whatever that would be, from the actors. I never wanted them to see it,” Scott said. “Remember there was no digital effects in those days at all. I’m going to somehow bring that creature out of his chest….”

Scott recalled the influence that Star Wars had on him at the time, noting, “It opened the gate for me feeling comfortable that science fiction was no longer silly fantasy but actually had a reality to it… So I was blown away… My hat still comes off to George,” Scott said of Lucas for the first Star Wars. “Without question his was by far the best, still.”

Scott directed the 2017 film Alien: Covenant, the Times notes, “And he may not be done yet.”
“What I always thought when I was making it, the first one, why would a creature like this be made and why was it traveling in what I always thought was a kind of war-craft, which was carrying a cargo of these eggs. What was the purpose of the vehicle and what was the purpose of the eggs? That’s the thing to question — who, why, and for what purpose is the next idea, I think.”

Read more of this story at Slashdot.

Dell’s All-AMD Gaming Laptop Hailed as a ‘Budget Blockbuster’

See the original posting on Slashdot

AMD “has a potent combination of both CPU and GPU technologies,” writes Slashdot reader MojoKid, that “can play well in the laptop market especially, where a tight coupling of the two processing engines can mean both performance and cost efficiencies.”

One of the first all-AMD laptops to hit the market powered by the company’s new Ryzen 4000 mobile processors is the Dell G5 15 SE, a 5.5-pound, 14.4-inch machine [with a 15.6-inch display] that sports an understated design for a gaming notebook but with an interesting glittery finish that resists fingerprints well. With a retail price of $1199 (starting at $879), the model tested at HotHardware is powered by an AMD Ryzen 4800H 8-core processor that boosts to 4.2GHz and an AMD Radeon RX 5600M mobile GPU with 6GB of GDDR6 memory…

In the benchmarks, AMD’s SmartShift technology load-balances CPU and GPU power supply for optimal performance, and the laptop posted very respectable numbers that are competitive with any similar Intel/NVIDIA-powered machine. The Dell G5 15 SE put up frame rates north of 60 FPS at maximum image quality in current-gen game titles, but at a significantly better price point, relatively speaking.

The GPU also has 2,304 stream processors across 36 compute units, and “Overall, we think Dell hit it out of the park with the new G5 15 SE,” the review concludes.
“This all-AMD budget blockbuster has all of the gaming essentials: a fast processor, a powerful GPU, and a 144 Hz display.”

Read more of this story at Slashdot.

Why You Shouldn’t Make a Habit of Force-Quitting iOS Apps or Restarting iOS Devices

See the original posting on Slashdot

Adam Engst, writing for TidBITS: Because force-quitting apps and restarting or shutting down devices are necessary only to fix unanticipated problems, there are two notable downsides to engaging in such behavior as a matter of habit: reduced battery life and wasted time. Why would these behaviors reduce battery life? Remember, iOS is a modern operating system that’s built on top of Apple’s proprietary hardware. Apple has put a great deal of effort into ensuring that iOS knows the best ways to manage the limited hardware resources within your iPhone or iPad. No one, possibly short of an iOS systems engineer armed with Apple’s internal diagnostic and debugging tools, would be able to outguess iOS itself on issues like memory usage, power draw, and CPU throttling.

When you invoke the App Switcher in iOS, you can swipe right to see all the apps you’ve used, possibly since you got your device. (The very first app in my iPhone 11 Pro’s App Switcher is Apple’s Tips, which I think came up automatically when I turned the iPhone on last year and hasn’t been touched since. It’s difficult to count apps in the App Switcher, but I probably have at least a hundred in there.) As the number of apps in the App Switcher should indicate, those apps are not necessarily running — they merely have run at some point in the past. They’re much more like the contents of the Mac’s Apple > Recent Items menu. In normal usage, iOS devotes the lion’s share of CPU and memory resources to the app that you’re using. That’s sensible — the performance of that app is paramount. However, the next few apps in the App Switcher may also be consuming some CPU and memory resources. That’s because iOS correctly assumes that you’re most likely to return to them, and it wants to give you the best experience when you do. The screen shouldn’t have to redraw multiple times, Internet-loaded content shouldn’t have to update, and so on. […]

Read more of this story at Slashdot.
