Monday, January 6th 2020

Intel 2020 CES Conference: Live Blog

Intel is presenting its 2020 International CES address, delivered by CEO Bob Swan. The company pledges relentless innovation and adherence to Moore's Law in introducing new technology. This is a live blog of the event.

01:16 UTC: And that's a wrap on Intel's CES event. Not even a passing mention of the Comet Lake desktop processors. They had time for athlete tracking, but not for their 10th Gen mainstream desktop product line. Intel has officially ceded this market to AMD!

01:10 UTC: Intel unveils DG1, its first discrete GPU, in a mobile form factor. It can work in tandem with the iGPU.
01:07 UTC: The Xe iGPU doubles graphics performance vs. Gen11. Advancements in display and media capabilities, and AI-assisted image-quality enhancements that leverage DL Boost.
01:00 UTC: Intel unveils "Tiger Lake" processors, combining "Willow Cove" CPU cores with the Xe iGPU and featuring Thunderbolt 4 and Wi-Fi 6.
00:59 UTC: Intel "Horseshoe Bend": A large 17-inch foldable PC.
00:57 UTC: Lenovo also uttered "Core Hybrid," possibly the commercial name of Lakefield (x86 big.LITTLE).

00:54 UTC: Lenovo shows off the Yoga Slim 7 and ThinkPad X1 Carbon, along with a new all-day device: a full-performance, 5G-capable foldable PC that weighs less than 1 kg.
00:53 UTC: Project Athena extended to Google Chromebook ecosystem.

00:51 UTC: The HP Dragonfly is a super-thin Project Athena laptop with a 360-degree hinge, powerful hardware, and a chassis made from recycled ocean plastic.
00:46 UTC: Gregory Bryant of the Client Computing Group takes the stage.
00:29 UTC: AI is a $25 billion opportunity by 2025. Shenoy details what the Habana Labs acquisition means. The Xeon Scalable processor is a solid foundation for AI, thanks to DL Boost and AVX-512.
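For the unfamiliar: DL Boost is Intel's umbrella name for its inference-acceleration features, including the AVX-512 VNNI instructions, which speed up the int8-multiply, int32-accumulate pattern at the heart of neural-network inference. A minimal NumPy sketch of that pattern, with illustrative values (this is not Intel's code, just the arithmetic VNNI accelerates):

import numpy as np

# Quantize a float32 tensor to int8 plus a scale factor.
def quantize(x):
    scale = np.abs(x).max() / 127.0
    return np.round(x / scale).astype(np.int8), scale

rng = np.random.default_rng(0)
activations = rng.standard_normal(256).astype(np.float32)
weights = rng.standard_normal(256).astype(np.float32)

a_q, a_scale = quantize(activations)
w_q, w_scale = quantize(weights)

# int8 products accumulated in int32 -- the step VNNI fuses into one
# instruction -- then dequantized back to float.
acc = np.dot(a_q.astype(np.int32), w_q.astype(np.int32))
print(acc * a_scale * w_scale)       # quantized result
print(np.dot(activations, weights))  # float32 reference, nearly identical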
00:25 UTC: Intel and Netflix are closer to deploying AV1, a bandwidth-friendly codec for the streaming-content leader. AV1 will roll out in 2020.
00:22 UTC: Navin Shenoy, Intel's enterprise computing head, takes the stage.

00:17 UTC: Intel's self-driving car tech has advanced: an autonomous vehicle drives through traffic in Jerusalem. AI is also helping the American Red Cross navigate disaster-hit zones, analyzing satellite images to trace roads and directions.
00:08 UTC: Intel predicts exponential growth of data, up to 175 ZB by 2025, and on average seven devices per human being, all generating data.

30 Comments on Intel 2020 CES Conference: Live Blog

#1
Daven
INTC has an Olympic gold medalist on stage!....
Posted on Reply
#2
Dave65
"The company pledges relentless innovation":
I'll believe it when I see it.
Posted on Reply
#3
Fouquin
Ugh that DG1 tease at the end. "It's here and it runs Destiny 2."

Great, so it performs as well as a mid-tier GPU from any time in the last 8 years.
Posted on Reply
#7
opteron
"Update 01:16 UTC: And that's a wrap on Intel's CES event. Not even a passing mention of Comet Lake desktop processor. They had time for athlete tracking but not their 10th gen mainstream desktop product-line. Intel has officially ceded this market to AMD! "


So bad that that there's no mention of any of the crappy * Lake CPUs, lol. This could be a record...
Posted on Reply
#8
notb
thesmokingman: Innovation... haha right!
Well, they mentioned some new technologies - from CPU architectures to interfaces. They talked about implementing AI, about a new codec.
AMD showed 3 SoCs based on architectures we've known for months.

OK, this may not be as interesting for the TPU community as AMD's keynote. It wasn't focused on gaming. It didn't tell us a lot about future desktops.

But don't you think you're slightly abusing the word "innovation"? :D
Posted on Reply
#9
DeathtoGnomes
01:10 UTC: Intel unveils DG1, its first discrete GPU, in a mobile form factor. It can work in tandem with the iGPU.
mobile? as in Thunderbolt?
Posted on Reply
#10
notb
DeathtoGnomesmobile? as in Thunderbolt?
Mobile as in a discrete chip made for laptops. Does it really need an explanation?
Posted on Reply
#11
R0H1T
notb: AMD showed 3 SoCs based on architectures we've known for months.
Ah, didn't realize you already knew that mobile Ryzen was gonna be monolithic, i.e. without chiplets. Do you also have any insider info about 5 nm, Zen 4, or the post-Navi world?
notb: Mobile as in a discrete chip made for laptops.
Just from that line it's not clear at all, unless you were also viewing the live event!
Posted on Reply
#12
notb
R0H1T: Ah, didn't realize you already knew that mobile Ryzen was gonna be monolithic, i.e. without chiplets. Do you also have any insider info about 5 nm, Zen 4, or the post-Navi world?

Just from that line it's not clear at all, unless you were also viewing the live event!
I don't know what you mean by either.

AMD was focused on products: they talked about the 4800U/4800H, the 5600 XT, and a bit about the 3990X. That's it.
Intel focused on tech and features.

Seriously, what is not clear about chips made for laptops? You're aware of their existence, right?
Posted on Reply
#13
Valantar
Not to sound harsh, but you should work on your live-blogging game. This was far too short on details; it reads like a bullet-point summary of a bullet-pointed presentation, i.e. too broad to reveal anything interesting. What did they say about the things they showed off?
Posted on Reply
#14
notb
Valantar: Not to sound harsh, but you should work on your live-blogging game. This was far too short on details; it reads like a bullet-point summary of a bullet-pointed presentation, i.e. too broad to reveal anything interesting. What did they say about the things they showed off?
I believe the main goal was to finish it with "AMD rulezzz".

That's how adults do it:
www.anandtech.com/show/15338/intel-ces-2020-keynote-live-blog-ice-comets-and-more-to-come
Posted on Reply
#16
Vya Domus
Valantar: Not to sound harsh, but you should work on your live-blogging game. This was far too short on details; it reads like a bullet-point summary of a bullet-pointed presentation, i.e. too broad to reveal anything interesting. What did they say about the things they showed off?
You also have to take into account the material at hand, which is, well, not that interesting. I'd rather avoid the boilerplate.
Posted on Reply
#17
notb
Vya Domus: You also have to take into account the material at hand, which is, well, not that interesting. I'd rather avoid the boilerplate.
Not that interesting to you maybe...
AI, autonomous cars, connectivity, AV1, new form factors (folding x86 tablets), and so on. If you're into tech, how could that not be interesting?

If you're just into gaming and Cinebench, the other company did just that. So something for everyone.

I watched the whole AMD thing and was bored to death (and finished off by the stuttering presenter).
Posted on Reply
#18
NeuralNexus
notb: Not that interesting to you maybe...
AI, autonomous cars, connectivity, AV1, new form factors (folding x86 tablets), and so on. If you're into tech, how could that not be interesting?

If you're just into gaming and Cinebench, the other company did just that. So something for everyone.

I watched the whole AMD thing and was bored to death (and finished off by the stuttering presenter).
I watched a good chunk of the Intel presentation and laughed through it. It was a desperate attempt by Bob Swan and the rest to remain relevant. Coffin Lake is all they have in the pipeline, because once AMD releases Zen 3 this year, it'll be a wrap for Intel on desktop and server.
Posted on Reply
#19
notb
NeuralNexus: I watched a good chunk of the Intel presentation and laughed through it. It was a desperate attempt by Bob Swan and the rest to remain relevant. Coffin Lake is all they have in the pipeline, because once AMD releases Zen 3 this year, it'll be a wrap for Intel on desktop and server.
Are you interested in anything other than CPU core count? :)

If I correctly recall the consensus on this forum, the first Ryzen in 2017 already killed Intel. So Zen 3 shouldn't change much, right? :)
Posted on Reply
#20
Vya Domus
NeuralNexus: It was a desperate attempt by Bob Swan and the rest to remain relevant.
It's not just that; they are also desperately trying to diversify, and have been doing so for the last couple of years. The question is: why? The first thing that comes to mind is that they've already saturated all the markets where they operate, but it's not that simple. They've clearly run into some trouble that puts those markets at risk, and yet they're still going full force into these other fields, as if they're not very confident in their traditional businesses.

Not only that, but they have a really bad record of trying to make a dent in various industries. Food for thought.
Posted on Reply
#21
notb
Vya Domus: It's not just that; they are also desperately trying to diversify, and have been doing so for the last couple of years. The question is: why? The first thing that comes to mind is that they've already saturated all the markets where they operate, but it's not that simple.
Intel's strategy was published not so long ago. Google it.
They have some revenue targets and they need to tackle markets other than CPUs to meet them.

At the same time, CPUs lost a lot of profit margin in the last 2 years (because of AMD's offensive), which means there's even more need to try something else.
Sure, not every endeavour worked for them, but some things definitely did.
You try 4 things, one sticks, you get rich. Business.

From your perspective, as a desktop user and a gamer, it would be best if Intel focused on desktop CPUs.
From my perspective, as a laptop user and a non-gamer, desktop CPUs hardly matter. I prefer how they push AI chips, Thunderbolt 4 and WiFi 6.
But from their perspective the goal is to make money. And they'll do whatever can lead to that. If trading bananas has more potential than making CPUs, they'll switch to trading bananas.
Posted on Reply
#22
Valantar
Vya Domus: You also have to take into account the material at hand, which is, well, not that interesting. I'd rather avoid the boilerplate.
That's mostly true, but how exactly are we supposed to figure that out from press coverage of this character? For all we know, there might have been interesting details about a lot of this stuff, but it's impossible to tell from this blog.
notb: Not that interesting to you maybe...
AI, autonomous cars, connectivity, AV1, new form factors (folding x86 tablets), and so on. If you're into tech, how could that not be interesting?

If you're just into gaming and Cinebench, the other company did just that. So something for everyone.

I watched the whole AMD thing and was bored to death (and finished off by the stuttering presenter).
-AI is a BS marketing term that ultimately means "fancy algorithms" with little-to-no real-world value (outside of datacenters, at least for now) in the areas they choose to focus on.
-Autonomous cars are at least a decade away from market readiness and regulatory approval. Besides, not all people have or want cars, let alone want them autonomous.
-AV1 is ... a video codec. Not quite keynote material, at least not by itself. It's a far smaller improvement than H.264 to H.265, and it's not like it's being announced alongside a dramatically improved encode/decode block or anything like that.
-New form factors are only relevant if a) they are real (as in not concept devices that never materialize), and b) have some actual utility to them. The Thinkpad X1 Fold looks very interesting, while Dell's two-screen XPS concept looks ...meh. Dual-screen, keyboardless laptops always prove to be less useful and less practical than people think. Inventing marginal use cases never makes for successful big-budget products, so the go-to idea must be improving on current use cases or increasing flexibility without sacrificing usefulness.
notb: From your perspective, as a desktop user and a gamer, it would be best if Intel focused on desktop CPUs.
From my perspective, as a laptop user and a non-gamer, desktop CPUs hardly matter. I prefer how they push AI chips, Thunderbolt 4 and WiFi 6.
But from their perspective the goal is to make money. And they'll do whatever can lead to that. If trading bananas has more potential than making CPUs, they'll switch to trading bananas.
a) You don't go to the Consumer Electronics Show to talk about your business plan. You go there to show your consumer electronics. At this point you're making excuses for them not having anything exciting to show off. WiFi 6 is not an Intel technology. TB4 is ... for now not a thing? It'll probably be good when it arrives, but I struggle to see how it will be more useful than TB3. And if it's more expensive than TB3, it'll be a dud.
b) What, exactly, are you going to be using "AI" for in your laptop? And how; what hardware will it be running on?
Posted on Reply
#23
Vya Domus
Valantar: Autonomous cars are at least a decade away from market readiness and regulatory approval.
You're very optimistic; there's a chance they'll never get there. Everyone thought they could just throw more data at AI problems and eventually get there, but it doesn't work like that: autonomous cars still can't do a lot of things reliably.
Posted on Reply
#24
notb
Valantar: AI is a BS marketing term that ultimately means "fancy algorithms" with little-to-no real-world value (outside of datacenters, at least for now) in the areas they choose to focus on.
Sorry, but I don't understand how a thinking adult can write something like this in 2020.
Yes, ML/AI is just about some "algorithms" - like everything we do on computers. If you don't know any use case, google will help you...
Valantar: Autonomous cars are at least a decade away from market readiness and regulatory approval. Besides, not all people have or want cars, let alone want them autonomous.
Not everyone wants a desktop either. I don't understand this argument...
Same with the "decade away". So what? Should we wait for regulations and only develop then? How would that even work? :o
Valantar: AV1 is ... a video codec. Not quite keynote material, at least not by itself. It's a far smaller improvement than H.264 to H.265, and it's not like it's being announced alongside a dramatically improved encode/decode block or anything like that.
Imagine a situation where Netflix uses AV1 and only Intel has a hardware decoder. YouTube may be going for AV1 as well...
For these streaming companies, more efficient codecs are the primary way to save costs.
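Back-of-the-envelope, in Python - every number below is made up (viewing hours, bitrate, the often-cited ~30% AV1-vs-VP9 reduction), just to show the scale:

hours_per_month = 10e9     # assumed total viewing hours across all users
vp9_bitrate_mbps = 5.0     # assumed current average stream bitrate
av1_reduction = 0.30       # assumed bitrate saving of AV1 over VP9

def egress_pb(bitrate_mbps):
    # Total data delivered per month, in petabytes.
    bits = bitrate_mbps * 1e6 * hours_per_month * 3600
    return bits / 8 / 1e15

before = egress_pb(vp9_bitrate_mbps)
after = egress_pb(vp9_bitrate_mbps * (1 - av1_reduction))
print(f"{before:.0f} PB -> {after:.0f} PB per month ({before - after:.0f} PB saved)")

Even with made-up inputs, a 30% bitrate cut works out to thousands of petabytes of egress per month. That's why they care.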
Valantar: a) You don't go to the Consumer Electronics Show to talk about your business plan. You go there to show your consumer electronics. At this point you're making excuses for them not having anything exciting to show off. WiFi 6 is not an Intel technology. TB4 is ... for now not a thing? It'll probably be good when it arrives, but I struggle to see how it will be more useful than TB3. And if it's more expensive than TB3, it'll be a dud.
And Intel's presentation was all about "consumer electronics", just from the tech point of view (not final products). AMD's presentation was mostly about gaming and occasionally about "creators". And only a small share of consumers does that.
Valantar: b) What, exactly, are you going to be using "AI" for in your laptop? And how; what hardware will it be running on?
Most people associate AI with autonomous cars, robots, terminators etc. It's just not true. :)

Dell just announced XPS laptops will learn how you use them, to optimize your workflow over time. We'll see more and more applications like that.
So your laptop kind of "thinks". But it's not really a "Skynet scenario", is it?

The "AI" umbrella term includes ML. And that's what I do most of the time (at work and as a hobby). I benefit from the ML boost libraries and ML accelerators.
As for "AI" itself, i.e. when the computer just decides how to do stuff - it's really nothing new. Photo/video editing software has been doing that for a long time. It's just that today we have chips and libraries that make this faster.

And there are many more possible uses.
Maybe virus scanners will benefit. Maybe some productivity software will get faster (it already does). Maybe my laptop will last longer on battery. It's all great.
Posted on Reply
#25
Valantar
notb: Sorry, but I don't understand how a thinking adult can write something like this in 2020.
Yes, ML/AI is just about some "algorithms" - like everything we do on computers. If you don't know any use case, google will help you...
Okay, here's a challenge: show me a real-world use case of AI with relevance for a relatively ordinary person that isn't already being done by conventional algorithms. Please. 'Cause I haven't seen a single one. I'd be impressed if you could even find one where a significant and relevant performance improvement can be seen.
notb: Not everyone wants a desktop either. I don't understand this argument...
Same with the "decade away". So what? Should we wait for regulations and only develop then? How would that even work? :o
Sure, not everyone wants a desktop PC, so we don't expect car makers to talk about them in their keynotes, do we? And no, regulators don't (usually) develop technologies, but consumers shouldn't be fed decade-or-more-away vaporware to create excitement for tech that might never show up either. All the hype for autonomous cars is just that: hype.
notb: Imagine a situation where Netflix uses AV1 and only Intel has a hardware decoder. YouTube may be going for AV1 as well...
For these streaming companies, more efficient codecs are the primary way to save costs.
This is nothing new; all streaming providers have migrated across codecs already, most from something older than H.265 to H.265. The thing you're missing is that they don't scrub their libraries of the other formats when this is done. You're trying to make this out as some situation where end users will actually notice the transition, which it isn't - it just means that Netflix can plan to scrub legacy formats from their libraries X years ahead, when most people have moved on to hardware supporting AV1. This has zero impact on consumers. What do I care if Netflix or YouTube saves money? YT is free, and there's no way those savings are doing anything but padding Netflix's bottom line.
notb: And Intel's presentation was all about "consumer electronics", just from the tech point of view (not final products). AMD's presentation was mostly about gaming and occasionally about "creators". And only a small share of consumers does that.
Intel's presentation was about potential consumer electronics in various fields that either don't exist or don't work. Fiction at best. And, crucially, their major field of expertise, which also tends to consist of real consumer electronics, was barely mentioned at all.
notb: Most people associate AI with autonomous cars, robots, terminators etc. It's just not true. :)
I know. There's nothing revolutionary about AI whatsoever, just a massive hype train over fancy algorithms that don't do much new.
notb: Dell just announced XPS laptops will learn how you use them, to optimize your workflow over time. We'll see more and more applications like that.
So your laptop kind of "thinks". But it's not really a "Skynet scenario", is it?
... you actually believe that? Tell me, how is your laptop supposed to "optimize your workflow"? In what way? This is typical vague nonsense that will never, ever pan out. Is it supposed to tell you when to take toilet breaks and when to drink more coffee, or nag you when you've been spending too much time talking to your colleagues?

There are some very tiny improvements that can be made, like pre-loading applications into memory if behavior is recognized that could be a precursor to using that application, but ... that's not going to make any kind of difference unless all you do every day is open and close slow-loading applications.
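And that precursor detection doesn't even need "AI" - a first-order Markov chain over launch history covers it. Toy Python sketch (the app names and the preload step are made up):

from collections import Counter, defaultdict

# Learn which app tends to follow which, then prefetch the likeliest successor.
class LaunchPredictor:
    def __init__(self):
        self.followers = defaultdict(Counter)  # app -> Counter of next apps

    def observe(self, previous, current):
        self.followers[previous][current] += 1

    def predict_next(self, current):
        counts = self.followers[current]
        return counts.most_common(1)[0][0] if counts else None

predictor = LaunchPredictor()
history = ["mail", "browser", "mail", "browser", "editor", "mail", "browser"]
for prev, curr in zip(history, history[1:]):
    predictor.observe(prev, curr)

likely = predictor.predict_next("mail")
if likely:
    print(f"preloading {likely}")  # stand-in for an actual prefetch into memory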
notbThe "AI" umbrella term includes ML. And that's what I do most of the time (at work and as a hobby). I benefit from the ML boost libraries and ML accelerators.
Good for you. You belong to a tiny niche even within the PC enthusiast space, which is itself a tiny niche of humanity. For the rest of us, this has pretty much zero tangible benefit.
notbAs for "AI" itself, i.e. when the computer just decides how to do stuff - it's really nothing new. Photo/video editing software has been doing that for a long time. It's just that today we have chips and libraries that make this faster.
Which is exactly what I was saying. They're trying to sell (slightly fancy) algorithms as something brand new and revolutionary, when the fact is that real-world improvements from this tech are ... tiny. Outside of datacenters and research, at least.
notb: And there are many more possible uses.
Maybe virus scanners will benefit. Maybe some productivity software will get faster (it already does). Maybe my laptop will last longer on battery. It's all great.
Sure, some stuff will get marginally faster, some stuff will get marginally better, and things will improve over time. I'm all for smarter battery management and similar systems, as most computer infrastructure is shockingly dumb in a lot of ways. But none of this is anything close to the revolutionary PR BS they are trying to sell it as. AI is not a new computing paradigm, it's just more of the same. Which is fine - what we already have is pretty frickin' cool. But slight improvements to various parts of it isn't going to revolutionize anything, so they should really stop saying that.
Posted on Reply