
2022 review: Elon Musk buys Twitter and we untangle AI


Happy New Year! We finish our round-up of 2022 stories with a look at why technology doesn't succeed unless it's easy to use, and ask why there is so much confusion over AI.

When Elon Musk took over Twitter in what was perhaps the takeover nobody was waiting for, it prompted a sizeable portion of users to look elsewhere. Increasingly it was Mastodon that attracted the largest share of Twitter migrant signups, but it quickly became apparent that switching wasn't as simple as many expected.

Mastodon is a different beast to Twitter, and is not as easy to understand or use, prompting us to examine the conditions under which technology succeeds, and why it sometimes doesn't.

"The lesson any tech company can learn is that products only succeed when they are easy to use, convenient, cost effective, and can adapt. VHS wasn't as good as Betamax, but the cassettes were smaller, and so were the recorders. VHS was cheaper and offered longer record times, and Sony's bet that the better image quality of Beta would win over consumers, despite the drawbacks, was a colossal misjudgement."

Read Tech only succeeds when it's easy - and Mastodon isn't...

Why is AI so hard to understand?

The concept of AI is a tricky one to grasp, even for computing experts. After all, nothing fundamental has changed about the way computers operate. They are still built upon transistors, but it's the speed of modern chips that has fuelled AI development. Yet, how do we understand how AI systems actually work? How do we differentiate between the different types of AI?

David Shapton began a new series explaining the AI phenomenon, and how to understand it.

"The tricky thing about understanding AI is that there's a massive gap between our intuitive knowledge of circuits and software and the apparent ability of machines to "think". But machines don't "think" yet. It's just that they're starting to look like they might."

Read Why is AI so hard to understand?

The NASA cameras capturing Artemis I and Orion


Meanwhile, back in space, NASA launched a mission to the moon, and it packed the transportation platforms full to the brim with cameras: 24, to be precise, to capture every moment.

Andy Stout wrote: "NASA, of course, knows a few things about selling the science and has made sure that cameras are an integral part of its mission parameters since the early days. Few missions though have been festooned with quite as many imaging devices as Artemis I. All in all, according to a detailed blog post published after the launch, there are 24 cameras on the rocket and spacecraft – eight on the SLS and 16 on Orion – to document essential mission events including liftoff, ascent, solar array deployment, external rocket inspections, landing and recovery, as well as capture images of Earth and the Moon."

Read The NASA cameras capturing Artemis I and Orion.

Why we need new numbers to handle the sheer amount of future data


Sticking with a science theme, apparently we are now producing so much data that new numbers have had to be invented. Ever heard of a yottabyte? No, me neither, until now. It equates to 10^24, or 1,000,000,000,000,000,000,000,000 bytes.

Andy Stout described it thus, "It’s a lot of data, so much so that a) Nature points out that a stack of DVDs holding it would stretch to Mars and b) the number that describes it wasn’t even given a name until 1991 when the Bureau International des Poids et Mesures bequeathed the world the yotta and the zetta prefixes."

But that's not enough to describe what might come next. As a result, the participants at the 27th General Conference on Weights and Measures (CGPM) voted on the naming scheme of some new number prefixes to deal with the data onslaught.
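To put the scale in perspective, here is a minimal sketch (not from the article; the function name and formatting are my own) of the SI prefixes for data, including ronna (10^27) and quetta (10^30), the two adopted at the 2022 CGPM:

```python
# SI prefixes for byte counts. Zetta and yotta were named in 1991;
# ronna and quetta were adopted by the CGPM in November 2022.
SI_PREFIXES = [
    ("kilo",   "k", 10**3),
    ("mega",   "M", 10**6),
    ("giga",   "G", 10**9),
    ("tera",   "T", 10**12),
    ("peta",   "P", 10**15),
    ("exa",    "E", 10**18),
    ("zetta",  "Z", 10**21),  # named 1991
    ("yotta",  "Y", 10**24),  # named 1991
    ("ronna",  "R", 10**27),  # adopted 2022
    ("quetta", "Q", 10**30),  # adopted 2022
]

def format_bytes(n: int) -> str:
    """Express a byte count using the largest applicable SI prefix."""
    name, symbol, value = SI_PREFIXES[0]
    for name, symbol, value in SI_PREFIXES:
        if n < value * 1000:  # stop at the first prefix that keeps n < 1000 units
            break
    return f"{n / value:.2f} {symbol}B ({name}bytes)"

print(format_bytes(10**24))  # 1.00 YB (yottabytes)
print(format_bytes(10**30))  # 1.00 QB (quettabytes)
```

Each new prefix is a factor of 1,000 larger than the last, so quetta buys us another six orders of magnitude beyond yotta before anyone has to convene again.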

Read Why we need new numbers to handle the sheer amount of future data.

Don't buy a C class drone yet: UK abandons new drone classes


Back on planet Earth, sort of, UK drone operators were lamenting the decision by the CAA not to adopt the new EU-based drone classification system, hanging some professional operators out to dry in the process.

The new regulations would have simplified things greatly, as well as giving much more flexibility on where drones can be flown, particularly lightweight ones. By abandoning the new regulations, the CAA has placed greater restrictions on UK pilots versus their EU counterparts.

Read Don't buy a C class drone yet: UK abandons new drone classes.

With ChatGPT's public preview AI has just taken a huge step forward


AI took yet another step forward and grabbed the world's attention with the public preview release of ChatGPT, built on GPT-3.5. Certainly, if you inhabit the Twittersphere you couldn't escape the many demonstrations of its prowess. From solving programming issues to writing screenplays, it seemed that ChatGPT had approached Skynet levels of sophistication. Indeed, we produced our own article using it.

It wasn't all as it seemed, however. Some of its responses became predictable, and Stack Overflow had to ban its use due to the incorrect code it was producing. That said, ChatGPT is still a huge step forward, and because it can learn, it will get better and better. And that's something that many of us aren't fully prepared for.

David Shapton wrote, "It makes Siri look like Baird's mechanical TV in an age of 4K. And it makes Google's future look questionable, to say the least. After all, why would anyone want a terse set of approximate results possibly skewed by advertising revenue when you could have an academically researched response that's easy to read and factual?"

Read With ChatGPT's public preview AI has just taken a huge step forward.

Can nuclear fusion solve all our energy worries?


2022 had one more surprise up its sleeve. The running joke is that nuclear fusion is always 30 years away, but this trope may well have just been broken. The problem with fusion isn't that we can't do it, but that experiments up until now have always required more energy to be put into them than comes out. The holy grail is a thing called ignition.

A team at the Lawrence Livermore National Laboratory in California recently announced that it had managed to generate 3.15MJ of energy from an input of 2.05MJ. It's a minuscule amount of energy, but the point here isn't just that the experiment produced more energy than was put into it, but that it achieved ignition; in other words, it became self-sustaining.
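The figures reported above translate into a target energy gain factor Q, the ratio of energy out to laser energy delivered to the target. A quick sketch (the function name is my own; the figures are those quoted in the article):

```python
def gain_factor(e_out_mj: float, e_in_mj: float) -> float:
    """Target gain Q: energy released divided by energy delivered.
    Q > 1 means the reaction produced more energy than was put in."""
    return e_out_mj / e_in_mj

# Lawrence Livermore's December 2022 result, as reported.
q = gain_factor(3.15, 2.05)
print(f"Q = {q:.2f}")  # roughly 1.54
```

Worth noting that this Q counts only the laser energy reaching the target, not the far larger amount of electricity needed to power the laser system itself, which is why a working power plant remains a long way off.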

Recent projects to construct fusion reactors have taken a different tack, with some not aiming to achieve ignition at all. First Light Fusion, for instance, is looking at a method called projectile fusion. According to First Light Fusion, "Projectile fusion is a new approach to inertial fusion that is simpler, more energy efficient, and has lower physics risk. Inertial fusion is a pulsed process, where, like an internal combustion engine, a small amount of fuel is injected and sparked to make it burn. The main existing approach to inertial fusion uses a large laser as the "spark plug", triggering the reaction. We use a high velocity projectile instead."

This isn't the only alternative approach, either, with several private start-ups constructing different types of reactors to test out their ideas. Regardless, the recent breakthrough with ignition-based fusion is a huge milestone.

Read Can nuclear fusion solve all our energy worries?

And with that, RedShark wishes all its readers the very best for the new year. Brace yourself, because if this year was anything to go by, we haven't seen anything yet.
