By: defaltluser (no.delete@this.thanks.net), November 2, 2020 5:21 pm
Room: Moderated Discussions
Mark Roulo (nothanks.delete@this.xxx.com) on October 28, 2020 7:09 pm wrote:
> anon (an.delete@this.n.net) on October 28, 2020 3:43 pm wrote:
> > defaltluser (no.delete@this.thanks.net) on October 28, 2020 2:46 pm wrote:
> > > anon (anon.delete@this.anon.anon) on October 27, 2020 8:52 pm wrote:
> > > > There is another cost to mergers / acquisitions, and that is focus. Intel is in more
> > > > desperate straits now and thus is forced to focus, while AMD doesn't have that sort
> > > > of pressure at this time, so buys Xilinx and thus spreads itself around more, losing
> > > > focus. Unless there is clear synergistic advantage, mergers should NOT happen.
> > >
> > > Intel is divesting Flash because 3D Xpoint has made it redundant. Also the western phone
> > > market is saturated even before Covid (and flash prices have cratered afterward).
> > >
> > > Intel are not divesting their 5-year-old investment in Altera, because there is benefit
> > > in adding built-in programmable cores to compute processors. If Intel thinks there is
> > > value there, then AMD buying its big brother would tend to make a lot of sense.
> > >
> >
> > Intel and AMD also believed there was value in "GPGPU", so Intel started developing
> > Larrabee and AMD acquired ATI, but in the end both Larrabee and HSA failed.
>
> Larrabee was an Intel attempt to create a discrete GPU without hardware Raster Output Units, not a
> general purpose GPGPU chip. It was subsequently re-targeted at GPGPU loads, but it was never competitive
> with NVidia offerings. But Intel did not start Larrabee because Intel believed in GPGPU.
>
> AMD purchased ATI in 2006. NVidia released CUDA in 2007 and AMD never showed any serious
> intention of providing a software stack to make effective use of ATI GPUs for GPGPU
> loads. It would be tough to argue that AMD purchased ATI because AMD believed there
> was value in GPGPU given the subsequent lack of any interest in pushing GPGPU.
>
> *Both* Larrabee and the ATI acquisition began as straightforward GPU plays.
>
> Larrabee just failed as a GPU.
>
> And AMD took so long to make APUs containing both AMD x86 CPUs and ATI GPUs that Intel managed to release
> iGPU products in 2010 (Westmere cores with HD Graphics) before AMD got out its first APU in 2011.
>
>
Technically, AMD cheated and ALMOST delivered the first modern APU before Intel.
They rushed out Bobcat at TSMC so they could completely sidestep the SOI transition delays that Llano was fighting, but it took until Jaguar before that line became a serviceable (GCN-based) quad-core.
But yeah, this guy is an idiot who doesn't seem to recall that AMD paid most of the price for ATI out of pocket.