{"id":462,"date":"2022-11-14T22:08:50","date_gmt":"2022-11-14T22:08:50","guid":{"rendered":"https:\/\/pc-keeper.tech\/index.php\/2022\/11\/14\/intels-fourth-graphics-attempt-ieee-computer-society\/"},"modified":"2022-11-14T22:08:50","modified_gmt":"2022-11-14T22:08:50","slug":"intels-fourth-graphics-attempt-ieee-computer-society","status":"publish","type":"post","link":"https:\/\/pc-keeper.tech\/index.php\/2022\/11\/14\/intels-fourth-graphics-attempt-ieee-computer-society\/","title":{"rendered":"Intel\u2019s Fourth Graphics Attempt | IEEE Computer Society"},"content":{"rendered":"<div>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-312380 img-responsive alignright\" src=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14215745\/Chasing-Pixels-Larrabee.jpg\" alt=\"Chasing Pixels - Larrabee\" width=\"250\" height=\"250\" srcset=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14215745\/Chasing-Pixels-Larrabee.jpg 250w, https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14215745\/Chasing-Pixels-Larrabee-150x150.jpg 150w, https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14215745\/Chasing-Pixels-Larrabee-100x100.jpg 100w\" sizes=\"auto, (max-width: 250px) 100vw, 250px\"\/>Intel surprised the industry in 2018 when word got out that the company would launch a discrete GPU product line. Intel had tried to enter the discrete graphics chip market before and, with the failure of the infamous i740 in 1999, vowed never again. But discrete graphics was a socket Intel didn\u2019t have, so the company gave it another go with the Larrabee project, publicly revealed in 2007.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">After three years of bombast, Intel shocked the world by canceling Larrabee. 
Instead of launching the chip in the consumer market, Intel would make it available as a software development platform for both internal and external developers. Those developers could use it to develop software to run on high-performance computers.<\/p>\n<p>\u201cLarrabee silicon and software development are behind where we hoped to be at this point in the project,\u201d said Intel spokesperson Nick Knupffer. \u201cAs a result, our first Larrabee product will not be launched as a standalone discrete graphics product.\u201d (December 4, 2009.)<\/p>\n<h2 style=\"color: #002855; font-size: 24px; font-family: Montserrat; font-weight: 500; line-height: 29px;\">How it began<\/h2>\n<hr style=\"text-align: left; width: 30%; height: 3px; color: #ffa300; background-color: #ffa300; border: none;\"\/>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Intel launched the Larrabee project in 2005, code-named SMAC. Paul Otellini, Intel\u2019s CEO, hinted about the project in 2007 during his Intel Developer Forum (IDF) keynote. Otellini said it would be a 2010 release and compete against AMD and Nvidia in the realm of high-end graphics.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Intel formally announced Larrabee in early August 2008 at Siggraph, then presented it at IDF in mid-August and at the Hot Chips conference in late August. The company said Larrabee would have dozens of small, in-order x86 cores and run as many as 64 threads. The chip would be a coprocessor suitable for graphics processing or scientific computing. Intel said at the time that programmers could decide, at any given time, how they would use those cores.<\/p>\n<p>\u00a0<\/p>\n<hr style=\"width: 100%;\"\/>\n<p>\u00a0<\/p>\n<p style=\"text-align: center; color: #ff6600;\"><strong>Want More Tech News? 
Subscribe to <i>ComputingEdge<\/i> Newsletter Today!<\/strong><\/p>\n<p>\u00a0<\/p>\n<hr style=\"width: 100%;\"\/>\n<p>\u00a0<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">By 2007, the industry was pretty much aware of Larrabee, although details were scarce and only came dribbling out from various Intel events around the world. By August of 2007 it was known that the product would be x86-based, capable of performing graphics functions like a GPU but not a \u201cGPU\u201d in the sense that we knew them, and that it was expected to show up sometime in 2010, probably 2H\u201910.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Speculation about the device, in particular how many cores it would have, entertained the industry, and maybe employed a few pundits, and several stories appeared to add to the confusion. Our favorite depiction of the device was the blind men feeling the elephant \u2013 everyone (outside Intel) claimed to know exactly what it was, and no one knew what it was beyond the tiny bits they were told.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">What we did know was that it would be a many-core product (\u201cmany\u201d meant something over 16 that year), would have a ring-communications system, hardware texture processors (a concession to the advantages of an ASIC), and a really big coherent cache. 
But most importantly, it would be a bunch of familiar x86 processors, and those processors would have world-class floating-point capabilities, with 512-bit vector units and quad-threading.<\/p>\n<figure id=\"attachment_312381\" aria-describedby=\"caption-attachment-312381\" style=\"width: 300px\" class=\"wp-caption alignleft\"><img decoding=\"async\" loading=\"lazy\" class=\"size-medium wp-image-312381 img-responsive\" src=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14215951\/Chasing-Pixels-Larrabee-block-diagram-300x121.png\" alt=\"Chasing Pixels Larrabee block diagram\" width=\"300\" height=\"121\" srcset=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14215951\/Chasing-Pixels-Larrabee-block-diagram-300x121.png 300w, https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14215951\/Chasing-Pixels-Larrabee-block-diagram.png 480w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\"\/><figcaption id=\"caption-attachment-312381\" class=\"wp-caption-text\">Larrabee block diagram (Source: Intel)<\/figcaption><\/figure>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">The diagram is schematic: the number of CPU cores and the number and type of coprocessors and I\/O blocks were implementation-dependent, as were the locations of the CPU and non-CPU blocks on the chip. Each core could access a subset of a coherent L2 cache to provide high-bandwidth access and simplify data sharing and synchronization.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Larrabee supported traditional graphics APIs such as DirectX and OpenGL through software renderers that used tile-based deferred rendering. 
Tile-based deferred rendering can be very bandwidth-efficient, but it presented some compatibility problems at that time\u2014the PCs of the day were not using tiling.<\/p>\n<figure id=\"attachment_312382\" aria-describedby=\"caption-attachment-312382\" style=\"width: 144px\" class=\"wp-caption alignright\"><img decoding=\"async\" loading=\"lazy\" class=\"size-full wp-image-312382 img-responsive\" src=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220046\/Larabeee-CPU-core.png\" alt=\"Larrabee CPU core\" width=\"144\" height=\"297\"\/><figcaption id=\"caption-attachment-312382\" class=\"wp-caption-text\">Larrabee CPU core and associated system blocks. The CPU was a Pentium processor in-order design, plus 64-bit instructions, multithreading, and a wide vector processor unit (VPU). (Source: Jon Peddie Research)<\/figcaption><\/figure>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Each core had fast access to its 256 kB local subset of the coherent second-level cache. The L1 cache sizes were 32 kB for I-cache and 32 kB for D-cache. Ring network accesses passed through the L2 cache for coherency. 
Intel manufactured the Knights Ferry chip in its 45 nm high-performance process and the Knights Corner chip in 22 nm.<\/p>\n<h2 style=\"color: #002855; font-size: 24px; font-family: Montserrat; font-weight: 500; line-height: 29px;\">Who asked for Larrabee anyway?<\/h2>\n<hr style=\"text-align: left; width: 30%; height: 3px; color: #ffa300; background-color: #ffa300; border: none;\"\/>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Troubled by the GPU\u2019s gain in the share of the silicon budget in PCs, which came at the expense of the CPU, Intel sought to counter the GPU with its own GPU-like offering.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">There had never been anything like it, and as such there were barriers to break and frontiers to cross, but it was all Intel\u2019s thing. It was a project. Not a product. Not a commitment, not a contract \u2013 it was an in-house project. Intel didn\u2019t owe anybody anything. Sure, they bragged about how great it was going to be, and maybe made some ill-advised claims about performance or schedules, but so what? It was for all intents and purposes an Intel project \u2013 a test bed, some might say a paper tiger, but the demonstration of silicon would make that a hard position to support.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">So, in late 2009, most of the program managers tossed their red flags on the floor and said it\u2019s over. We can\u2019t do what we want to do in the time frame we want to do it. And, the unspoken subtext was, we\u2019re not going to allow another i740 ever to come out of Intel again.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Now the bean counters took over. What was the business opportunity for Larrabee? 
What is the ROI? Why are we doing this? What happens if we don\u2019t? A note about companies \u2013 this kind of brutal, kill-your-darlings examination is the real strength of a company.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Larrabee as it had been positioned to date would die. It would not meet Intel\u2019s criteria for price-performance-power in the projected window \u2013 it still needed work; like Edward Scissorhands, it wasn\u2019t finished yet. However, the company stated that it still planned to launch a many-core discrete graphics product but wouldn\u2019t say anything about that future product until sometime the following year.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">And so it was stopped. Not killed \u2013 stopped.<\/p>\n<h2 style=\"color: #002855; font-size: 24px; font-family: Montserrat; font-weight: 500; line-height: 29px;\">Act II<\/h2>\n<hr style=\"text-align: left; width: 30%; height: 3px; color: #ffa300; background-color: #ffa300; border: none;\"\/>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Intel invested a lot in Larrabee \u2013 in dollars, time, reputation, dreams and ambitions, and exploration. None of that was lost. It didn\u2019t vanish. Rather, that work provided the building blocks for the next phase. Intel did not change its investment commitment to Larrabee. No one was fired, transferred, or time-shared. In fact, there were still open reqs.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Intel built and learned a lot \u2013 maybe more than it originally anticipated. Larrabee was an interesting architecture. It had serious potential as a significant coprocessor in HPC, and we believed Intel would pursue that opportunity. 
They called it \u201cthroughput computing.\u201d We called \u201cthroughput computing\u201d a YAIT: yet another Intel term.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">So the threat of Larrabee to the GPU suppliers shifted from the graphics arena to the HPC arena \u2013 more comfortable territory for Intel.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Intel used the Larrabee chip for its Knights series MIC (many integrated core) coprocessors. Former Larrabee team member Tom Forsyth said, \u201cThey were the exact same chip on very nearly the exact same board. As I recall, the only physical difference was that one of them did not have a DVI connector soldered onto it.\u201d<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Knights Ferry had a die size of 684 mm\u00b2 and a transistor count of 2.3 billion\u2014a large chip. It had 256 shading units, 32 texture mapping units, and 4 ROPs, and it supported DirectX 11.1. For GPU compute applications, it was compatible with OpenCL version 1.2.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">The cores had a 512-bit vector processing unit, able to process 16 single-precision floating-point numbers simultaneously. Larrabee differed from the conventional GPUs of the day: it used the x86 instruction set with specific extensions, and it had cache coherency across all its cores. It performed tasks like z-buffering, clipping, and blending in software using a tile-based rendering approach (refer to the simplified DirectX pipeline diagram in Book two). Knights Ferry, aka Larrabee 1, was mainly an engineering sample, but a few went out as developer devices. Knights Ferry D-step, aka Larrabee 1.5, looked like it could be a proper development vehicle. 
The Intel team had lots of discussions about whether to sell it and, in the end, decided not to. Finally, Knights Corner, aka Larrabee 2, was sold as Xeon Phi.<\/p>\n<h2 style=\"color: #002855; font-size: 24px; font-family: Montserrat; font-weight: 500; line-height: 29px;\">Next?<\/h2>\n<hr style=\"text-align: left; width: 30%; height: 3px; color: #ffa300; background-color: #ffa300; border: none;\"\/>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">That was not the end of Larrabee as a graphics processor \u2013 it was a pause. If you build GPUs, enjoy your summer vacation \u2013 the lessons will begin again.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Or will they?<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Maybe the question should be \u2013 why should they?<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Remember how we got started \u2013 one of the issues was the GPU\u2019s gain in the PC silicon budget at the expense of the CPU. There were multiple parameters to that, including:<\/p>\n<ul style=\"padding-left: 5%; color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em; list-style-image: url('https:\/\/ieeecs-media.computer.org\/wp-media\/2021\/11\/17161248\/Icon_Right-Double-Arrow.png');\">\n<li>Revenue share of the OEM\u2019s silicon budget<\/li>\n<li>Unit share of CPU-GPU shipments<\/li>\n<li>Mind share of investors and consumers<\/li>\n<li>Subsequent share price, based on all of the above.<\/li>\n<\/ul>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Discrete GPU unit shipments had a low growth rate. 
ATI and Nvidia hoped to offset that with GPU compute sales; however, those markets would grow more slowly than the gaming and mainstream graphics markets of the previous ten years had.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Why would any company invest millions of dollars to be the third supplier in a flat-to-low-growth market? One answer is that ASPs and margins are very good. It\u2019s better for the bottom line, and hence the P\/E, to sell a few really high-value parts than a zillion low-margin parts.<\/p>\n<h2 style=\"color: #002855; font-size: 24px; font-family: Montserrat; font-weight: 500; line-height: 29px;\">Why bother with discrete?<\/h2>\n<hr style=\"text-align: left; width: 30%; height: 3px; color: #ffa300; background-color: #ffa300; border: none;\"\/>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">GPU functionality was going in with the CPU \u2013 a natural progression of integration, putting more functionality in the same package. The following year we saw the first implementations. They were mainstream in performance and not serious competition for discrete GPUs, but they further eroded the low end of the discrete realm. 
Just as IGPs stole the value segment from discrete, embedded graphics in the CPU took away the value and mainstream segments, and even encroached on the lower segments of the performance range.<\/p>\n<figure id=\"attachment_312383\" aria-describedby=\"caption-attachment-312383\" style=\"width: 300px\" class=\"wp-caption alignright\"><img decoding=\"async\" loading=\"lazy\" class=\"size-medium wp-image-312383 img-responsive\" src=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220151\/graphics-chip-shipments-during-the-recession-300x225.png\" alt=\"graphics chip shipments during the recession\" width=\"300\" height=\"225\" srcset=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220151\/graphics-chip-shipments-during-the-recession-300x225.png 300w, https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220151\/graphics-chip-shipments-during-the-recession.png 480w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\"\/><figcaption id=\"caption-attachment-312383\" class=\"wp-caption-text\">Graphics chip shipments during the 2007\u20132009 recession (Source: Jon Peddie Research)<\/figcaption><\/figure>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">That meant the unit market share of discrete GPUs would decline further. That being the case, what was the argument for investing in that market?<\/p>\n<h2 style=\"color: #002855; font-size: 24px; font-family: Montserrat; font-weight: 500; line-height: 29px;\">Conclusion<\/h2>\n<hr style=\"text-align: left; width: 30%; height: 3px; color: #ffa300; background-color: #ffa300; border: none;\"\/>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Intel made a hard decision and a correct one at the time. 
Larrabee silicon was pretty much proven, and the demonstration at SC09 of measured performance hitting 1 TFLOPS (albeit with a little clock tweaking) on an SGEMM performance test (a 4K by 4K matrix multiply) was impressive. Interestingly, it was a compute measurement, not a graphics measurement. Maybe the die had been cast then (no pun intended) about Larrabee\u2019s future.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Intel\u2019s next move was to make Larrabee available as an HPC software development platform for both internal and external developers. Those developers could use it to create software that could run on high-performance computers.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">That left the door open for Intel to take a second run at the graphics processor market. The nexus of compute and visualization, something we had discussed at Siggraph, was clearly upon us, and it was too big and too important for Intel not to participate in all aspects of it.<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">Although Intel disparaged the GPU every chance it got, it always wanted one and tried a few times before Larrabee (e.g., the 82786, i860, and i740). In 2018, it kicked off another GPU project, code-named Xe.<\/p>\n<h2 style=\"color: #002855; font-size: 24px; font-family: Montserrat; font-weight: 500; line-height: 29px;\">Epilog<\/h2>\n<hr style=\"text-align: left; width: 30%; height: 3px; color: #ffa300; background-color: #ffa300; border: none;\"\/>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">In early November 2022, u\/Fuzzy posted at the LinusTechTips forum that they had recently acquired an Intel Larrabee test AIB and managed to boot it on Windows 10. 
<\/p>\n<p>Source: \u201cGot the Intel Larrabee Working,\u201d LinusTechTips<\/p>\n<p style=\"color: #454545; font-size: 18px; font-family: Open Sans; font-weight: 400; line-height: 1.7em;\">And if you\u2019re interested in the complete history and future trends of GPUs, check out Dr. Peddie\u2019s three-book series coming out this month.<\/p>\n<p><img decoding=\"async\" loading=\"lazy\" class=\"img-responsive alignnone wp-image-312384\" src=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220410\/history-of-the-gpu-202x300.png\" alt=\"history of the gpu\" width=\"96\" height=\"142\" srcset=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220410\/history-of-the-gpu-202x300.png 202w, https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220410\/history-of-the-gpu.png 222w\" sizes=\"auto, (max-width: 96px) 100vw, 96px\"\/>\u00a0 \u00a0 \u00a0<img decoding=\"async\" loading=\"lazy\" class=\"img-responsive alignnone wp-image-312385\" src=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220459\/New-Developments-197x300.jpg\" alt=\"New Developments\" width=\"96\" height=\"146\" srcset=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220459\/New-Developments-197x300.jpg 197w, https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220459\/New-Developments.jpg 217w\" sizes=\"auto, (max-width: 96px) 100vw, 96px\"\/>\u00a0 <img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-312386 img-responsive\" src=\"https:\/\/ieeecs-media.computer.org\/wp-media\/2022\/11\/14220522\/history-of-the-gpu-part-2.jpg\" alt=\"history of the gpu part 2\" width=\"96\" height=\"146\"\/><\/p>\n<\/div>\n<p><a href=\"https:\/\/www.computer.org\/publications\/tech-news\/chasing-pixels\/intels-fourth-graphics-attempt\/\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Intel surprised the industry in 2018 when word got out that the company would launch a discrete 
GPU product&hellip;<\/p>\n","protected":false},"author":1,"featured_media":463,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[399,358,280,360,2,348],"tags":[],"class_list":["post-462","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-chasing-pixels","category-gpus","category-microchips","category-processors","category-tech-news-post","category-vlsi"],"_links":{"self":[{"href":"https:\/\/pc-keeper.tech\/index.php\/wp-json\/wp\/v2\/posts\/462","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/pc-keeper.tech\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/pc-keeper.tech\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/pc-keeper.tech\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/pc-keeper.tech\/index.php\/wp-json\/wp\/v2\/comments?post=462"}],"version-history":[{"count":0,"href":"https:\/\/pc-keeper.tech\/index.php\/wp-json\/wp\/v2\/posts\/462\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/pc-keeper.tech\/index.php\/wp-json\/wp\/v2\/media\/463"}],"wp:attachment":[{"href":"https:\/\/pc-keeper.tech\/index.php\/wp-json\/wp\/v2\/media?parent=462"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/pc-keeper.tech\/index.php\/wp-json\/wp\/v2\/categories?post=462"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/pc-keeper.tech\/index.php\/wp-json\/wp\/v2\/tags?post=462"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}