People have complained that this generation of game consoles and the past one are too similar to computers... and while both the Xbox Series S/X and the PlayStation 5 do indeed use slightly customized x86-64 computer CPUs, it's important to note that the line dividing home computers and video game systems was blurred from the start.
In the 1970s, technology was still in the early stages of development, and hugely expensive. Consumer electronics like video games had to be extremely primitive to keep those heavy costs within reach. The first home game console was Ralph Baer's Magnavox Odyssey, not much more than an electronic Etch-A-Sketch with pin jumpers for cartridges and plastic overlays for your television. By the late 1970s, game consoles had improved, but were still hamstrung by limitations. The Atari 2600 had no video RAM of its own, forcing designers to "race the beam" of a television set, rebuilding each line of graphics in the sliver of time before the electron beam made its next split-second pass over the display. The Odyssey2 had an appallingly low 64 bytes of RAM... that's not gigs, or megs, or even kilobytes, but 64 bytes (characters), just enough memory to hold a small sentence. The Intellivision's internal clock speed could be measured in mere kilohertz rather than megahertz, resulting in the plodding pace of many of its games.
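To get a sense of just how little time "racing the beam" left a programmer, here's a back-of-the-envelope calculation in Python. The figures are the standard published NTSC Atari 2600 numbers (a 6507 CPU clocked at one-third of the NTSC color clock, 262 scanlines per frame at roughly 60 frames per second), not anything specific to this article:

```python
# Rough timing budget for "racing the beam" on an NTSC Atari 2600.
CPU_HZ = 3_579_545 / 3   # 6507 runs at 1/3 the NTSC color clock (~1.19 MHz)
SCANLINES = 262          # scanlines in one NTSC frame
FPS = 60                 # frames per second (approximately)

cycles_per_frame = CPU_HZ / FPS
cycles_per_scanline = cycles_per_frame / SCANLINES
print(round(cycles_per_scanline))  # ~76 CPU cycles per scanline
```

Seventy-six cycles per scanline, with much of that eaten by positioning sprites and updating registers, is why 2600 programming was such a high-wire act.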
By the 1980s (especially after the disastrous Atari 2600 ports of Pac-Man and Donkey Kong), players demanded more power from their game consoles to more accurately capture the elusive arcade experience. Console manufacturers addressed this need by taking the more robust hardware of 1970s personal computers and turning this withered but still useful technology into next-generation game consoles. The successor to the Atari 2600 was the Atari 5200, which used the hardware of the Atari 400 computer as a foundation for its design. Indeed, the two machines are so similar that Atari 400 games can be ported to the Atari 5200 with ease... and they continue to be to this day, to fill gaps in the latter system's anemic library. Atari would return to this well in the late 1980s for the XEGS game system, a console with the added power of the XE computer line and an optional keyboard. (If you called an XEGS "two Atari 5200s duct-taped together," you wouldn't be far off.)
The Atari 5200's more successful competitor, the ColecoVision, was made from off-the-shelf parts, which means it shares its DNA with a whole lot of 8-bit home computers and game consoles. Its closest cousins are Sega's SG-1000 and the Japanese MSX computer, with Britain's ZX Spectrum hanging on a nearby branch of the family tree. Bit Corporation's Dina 2-in-1, a Taiwanese game console sold by video game mail order company Telegames, offers compatibility with both the ColecoVision and the SG-1000, thanks to the two machines' similar hardware. The MSX is moderately more powerful than the ColecoVision and thus not directly compatible with it, but its games can be run on a ColecoVision with tweaks to their code and the aid of a peripheral called the Super Game Module, designed by Eduardo Mello. The later Sega Master System contains the legacy hardware of these machines along with its own vastly improved graphics processor, and can be coaxed into running games for the MSX, the SG-1000, and even the ColecoVision.
Of course, just because a video game system contains the CPU of a home computer does not mean that the two systems are at all similar. The Sega Genesis has the 16-bit 68000 processor of the Commodore Amiga, but differs from that system in many ways, with less RAM, an entirely different graphics processor, and sound hardware driven by the Z80 brains of the Sega Master System. Although there were a good many Amiga ports to the Sega Genesis (The Killing Game Show, Risky Woods, Shadow of the Beast, and Lemmings among others), they had to be rewritten from scratch, and were often lesser games than their Amiga counterparts.
On the Nintendo side of the fence, the Super NES contained the Ricoh 5A22, built around the 65C816, a 16-bit offshoot of the 6502-family CPU in the NES. One might think that this would make the Super NES cross-compatible with the Apple IIgs, but one would need to think again, as the Super Nintendo was packed with cutting-edge, custom-designed technology that elevated the system's graphics and sound past its competitors. By contrast, the Apple IIgs used the 65C816 to maintain compatibility with previous systems in the Apple II line. It was still a 16-bit computer that offered improvements over the dusty old Apple IIe you might have used in middle school, but despite using the same CPU, the Apple IIgs doesn't come anywhere near the quality of the Super NES as a games machine.
Future game consoles would also share a genetic link with home computers... the Xbox is essentially an x86 PC tailored for the video game experience, and the GameCube took the PowerPC architecture of late-1990s Macintosh computers and smashed it into a purple lunchbox. Then Nintendo took the same hardware and smashed it into two other shells (the Wii and Wii U), with extra RAM, extra cores, and extra... uh, waggle.
So if you were wondering when video game systems suddenly became home computers, the answer is that it's always been this way. You just didn't notice.