10 Years of Being a PC Gamer (Part 4): The long wait for 16nm…

Views: 308 , Video Rating: 5.00 , View Time: 40:44 Minutes, # of Likes: 18, # of Dislikes: 0

This is the fourth video in a series about what it was like to watch the landscape in PC Hardware change over the course of a decade.

In 2014, 28nm was getting old… and yet it would be two more years before we got GPUs on a new node. Meanwhile, Intel was also stuck on 22nm, and AMD’s CPU releases were a combination of sad and revolutionary. This was an era of survival for AMD, supremacy for Nvidia, and stagnation for Intel.


Direct One-Time Donations:
Cash App Donations:

LINKS!!!! (I encourage people to check out the comments sections from the past):

So this is part four of 10 Years of Being a PC Gamer, you know, the series I've been making about what it was like, at least from my point of view, to watch the (mostly gaming) PC hardware space change over the past ten years, the ten years before 2019. Coming off hot from 2013 into early 2014, I talked about Kepler and GCN and how Intel had cemented their dominance in the CPU space. This is probably going to be the longest video; I thought about it a lot, back and forth, on how to do part 4. Because if you look back on my series, the first parts cover the first three years, when I was a noob to PC gaming, so those first few years were effectively the same era, and 2012 and 2013 each really distinctly stand out to me because I learned so much in each of those years. But in my opinion, 2014 through 2016 was really just one long wait to get the hell out of 28nm. Everyone really expected 20nm to come by 2015 at the latest, and it didn't come until 2016, and it wasn't even 20nm; 20nm was skipped to go to 14nm and 16nm.

This was an era where, I mean, really think about it: AMD was dominating in performance and efficiency during my part 1 time period, despite me not really knowing much about them back then since I was such a noob, and they were only able to hold on to maybe a little over 50% of the graphics market share. And in CPU market share, even before the FX era, back when they were destroying Intel, they were getting like 25-30 percent of the market. So this is an era where you have to ask yourself: if AMD can only hold half of a market when they're dominating, and about a third of a market when they're only slightly better, as with the 7000 series and GCN, what happens when Nvidia roflstomps them? What happens when Intel pulls ahead? This is really the dark era for AMD; this is when Nvidia really took the lead; this is the precursor
to where we are now. But before I get into graphics, since I feel like this series has been very graphics-heavy, I would like to go back and do almost a recap on what was going on in the CPU space, because this is actually when things kind of did start to get interesting again.

So basically, by this point, Intel wasn't releasing anything in 2014. Put yourself in my shoes as someone who got into this space five years before 2014: I got to see the Core 2 Duo and the Nehalem Core i series versus the Phenom II series, and those were fairly interesting times, relatively speaking. Even then you had Sandy Bridge, and before we found out Bulldozer sucked, there was still a lot of excitement that it could be good. Then Ivy Bridge came out; yes, it was only 5 to 10 percent stronger than Sandy Bridge at best, but it used like two-thirds or half the energy, which was a big deal for laptops. And even Haswell at least got you a little better efficiency and 10 percent better performance. But going into 2014, Intel was releasing basically nothing, and that seems not abnormal to someone who built their PC in 2016 (I think a lot of people did build their PCs in 2016; it was one of those years). Remember, that's when Skylake comes out, and then Kaby Lake, and then Coffee Lake, and it just keeps going on and on with these Lakes. But that wasn't the norm. The 4790K came out mid-2014, and this is the first time we really saw Intel, within about a year of releasing their previous generation, put out what was basically just a 5% higher-clocked version of their previous chip. It had a little bit better thermal paste, which was nice, and it had higher clocks; 4.7GHz is when you started to get close to 5GHz after overclocking. But it just wasn't really that interesting, and Intel continued to flounder. They were supposed to be on 14nm the year before, and said they
did this; and then, just so they could say they released something, they had the Core M series, basically the five-to-ten-watt series, launch with Broadwell, which was efficient. But sneaking that in at the end of the year, a year late, did not sit well. This is when it started to become clear something was wrong with Intel upgrading nodes, and this continued into 2015 with Broadwell launching late on desktop and not getting any real attention, basically being an almost (not quite, but almost) laptop-only launch. And then that led into Skylake, and Skylake was good, but again, it was basically Haswell 2.0, maybe a little better than how we perceived Haswell, and I think that's just because it was such a long wait. It's about ten percent stronger again, great at gaming, but it took years to get another ten to twenty percent performance boost, and prices weren't going down; they were going up with every generation.

Meanwhile, at AMD, we had some pretty desperate attempts to stay competitive. They had been making slight updates to their Piledriver and APU lineups, and in late 2014 we got the FX-9590, a Piledriver CPU: a one-thousand-dollar eight-core that clocked at five gigahertz. This was the first five-gigahertz desktop CPU, well, the first five-gigahertz saleable CPU, ever, which was a milestone. But I mean, look at this: we're talking about something that in some games is losing to a 50-watt i3. An i3 beating a five-gigahertz eight-core; even if you count the modules as only one core each, this is pathetic. It doesn't matter how you dice it, this just was not for gaming, and even in productivity tasks it just wasn't looking good. They pushed this thing out so they could say they had something for gamers; you had this marketing push from AMD, and within a year the price dropped to a fourth of its launch price (oh, and it came with a water cooler). It became abundantly clear that one of the
main reasons was to catch headlines, and another reason they launched this chip was so they could sell it with the 290X. This was kind of when things started to get a little dark for AMD, like really dark, to the point of: what is going on, are we ever going to have any competition again in this space? I mean, look at this: it got down to $230, this ridiculous 220-watt chip. They just wanted to be able to say you could buy a high-end AMD CPU with their high-end 290X. But the fact is, by now everyone had realized that Radeon could compete with Nvidia, but their CPUs were woefully behind. Woefully behind. Really, this is the period of time where the Radeon part of the business was propping up the CPU part, not the other way around. Normally AMD's CPU business is the most important part, but if it wasn't for their graphics business, I'm not sure what would have happened to them. In this time period their stock price went down to $5; that's almost a fifth of where it is now, and 10 years prior it was, you know, 10 times higher. These were bad times for AMD's CPUs.

Having said that, it was interesting, though, right? Pretty interesting, I think, that they launched an eight-core at 5GHz and it came with flashy marketing. And they also started launching other interesting stuff, like the Athlon 5350. This was actually the same architecture as the CPU in the PS4 and Xbox One; the PS4 and Xbox One have, I believe, eight cores, but one core is disabled and they keep one for background tasks, so for gaming purposes you're looking at like 1.6-1.7GHz across six cores for the consoles, which is enough. If you program to the metal, it's almost the equivalent of an i3 for gaming, which is more than enough for 30 FPS gaming, and using GPGPU the consoles could make up for it a little bit, especially the PS4. But this was an interesting product, because this CPU was like 50 bucks, and that's for the top-end model; if you got the 1.3-1.5GHz quad-core, it was like $30 for
a quad-core. And if you weren't doing high-end gaming, this was one hell of a budget PC CPU. There was even a motherboard where you could overclock it to about 2.5GHz, meaning, if you're thinking 2.5GHz quad-core versus 1.6GHz six-core, you had something a little better than what's in the consoles. You could get the CPU for like $50, the motherboard for $35, it only needed a single channel of memory, and the whole system only used 25 watts. This was a very interesting time. I actually used one of these motherboards (this one from Gigabyte, I think literally this motherboard) to make a super-budget mining rig when I was poor in college, because the motherboard was $30, I could get a CPU for about 20 bucks, put in PCIe risers, and have three 6800-series cards mining. It worked. And if you were a budget gamer, you could chuck in a used card, like maybe a used 7770 or something; there were a lot of builds showing a 750 Ti with this thing. This was an interesting time for budget builds, and the integrated graphics were actually good enough to play pretty much any game made before 2013. So if you wanted to boot up Battlefield 3 or Oblivion or something, this was a 25-watt, $100-$200 PC that could play games. Interesting times.

You also had the launch of AMD's Kaveri here, and this is when AMD was really cementing their budget APU builds. This is a 100-watt APU that can play games at 720p, even 720p at ultra or medium settings, and 1080p at low to medium settings. This was a decent setup; you could really game on these APUs. And what was interesting about all these launches, if you really think about it, from the point of view of someone who watched Bulldozer get destroyed in 2012, and then Piledriver improve but then still get
destroyed by Haswell and Ivy Bridge: it used to be FX crap after FX crap, and then in 2014 we got this really cool Kabini low-power APU series and this really cool Kaveri budget platform with a good APU. AMD was releasing new CPUs again. They couldn't defeat Intel, but they could definitely compete in areas where Intel was unwilling to compete, and this is when you started to see almost a new swagger in AMD. Lisa Su took over as CEO in 2014; not a coincidence. This is when they had a lot of new CPU product launches going on that weren't great but were actually marketed, and marketed well, and they filled parts of the market that Intel was unwilling to. This is when you started to see that AMD wasn't going to give up, and this is what I really think of as AMD's Battle of Dunkirk. They started to lose in a lot of different spaces, but they just kept releasing stuff. It didn't matter that the FX-9590 was kind of shitty (really shitty compared to the competition); it didn't matter if their APUs couldn't compete on CPU performance, because they had good overall APU performance; and they had a stranglehold on the console market. And you know what, if Nvidia made graphics cards that were more efficient, they would lower their prices and stay in the game. This is when AMD just stayed in the game.

This is also the point in time, and I just want to bring this up, where I thought that anyone doing an ATX build was silly. CrossFire and SLI were almost dead by now, by late 2014, and this is where you really started to see that, even though CPUs were stagnating with Haswell and Skylake, you could get some really high-end ITX motherboards, put a Titan or a 290X in there, and you weren't compromising any performance, effectively. Now, this is a build I did in late 2016, but these cases had been around since 2015, and this was something where you could have a PC that's smaller than the Xbox One and has a Titan in it, even like a
Maxwell Titan or something. This is when, and I continue to say this, unless you have Threadripper or some kind of Intel HEDT setup, or some reason you need a large motherboard to fit a bunch of things, you really don't need bigger than micro-ATX, and you really probably don't need anything bigger than ITX; with ITX you're probably spending 10% more to shrink everything down anyway. So that's what was going on with CPUs: everything was getting smaller, but performance just really wasn't increasing. AMD was staying in the game, but nothing they were doing was that impressive.

So, in 2014 in the GPU space, AMD continued to do that FineWine thing of improving how they optimized GCN, more and more and more, and it really got ridiculous by mid-2014. I mean, look at this chart here, just a random game I pulled up, literally the first one I clicked on in a review from February: Max Payne 3, a very popular game by then. The 7850 is stronger than the 660, which is not how it's meant to be; you have the 660 Ti and the 760 just getting destroyed by basically everything in AMD's lineup here. This was really confusing to me; like I said in the last video, I don't know why anyone was buying Kepler video cards at this point in time. But at the end of the day, Nvidia continued to hold on to most of the market; they had about 60 percent of the market share back then, I believe. And this is where you really have to ask yourself, like I said in the intro: if AMD is dominating in technology but not really executing on everything they could be... I mean, they only held a little over 50% of the market with the 4000 and 5000 series, when they were really just roflstomping Nvidia. They only get half the market when they're roflstomping; when they're merely ahead, with GCN, they only get about 40% of the market. What do you think happens when Nvidia actually figures out what they're doing? Remember, late at the end of 2011, 28nm GCN came
out, and then all of 2012 and 2013 was 28nm. By now everyone was asking when 20nm, or 22nm, was coming out, and in 2014 it kind of started to become clear that this just wasn't happening anytime soon, that we might have three generations of 28nm from both companies. Nvidia realized this, I think, slightly earlier than AMD; if you look at their roadmap, they didn't have a Pascal, they just had Maxwell. It was meant for 20nm, but they quickly retooled it for 28nm because they didn't want to waste any time. And so what we got was the first Maxwell card in early 2014, the 750 Ti, and this was the canary in the coal mine. This is a card that came pretty close to a 7850; it was overall maybe 10 to 20 percent weaker, but it used half the energy. It was a tiny graphics card, usually not even needing a PCIe power connector. I mean, look at it here: it's right up there near the 7850, and stronger than the 650 Ti all the time. This thing was impressive, and I remember it specifically because it was very interesting: you had this graphics card that really started to compete in mining. Anyone who mined in the 2012 through 2014 era knows how crazy that was; Nvidia cards were an utter joke at mining. It was to the point where even older cards were almost better than Kepler at some coins, and that's because Kepler just didn't have any compute throughput compared to AMD, and that's what you need when you're doing computationally intensive things: a lot of core power and a lot of bandwidth to feed those cores to do the calculations. So when you have a 750 Ti being worth mining with, because it effectively used the same amount of energy as a 260X, or even a little less, but it mined almost as well as a 7850, that gives you a little pause. That might tell you this thing could destroy everything in gaming if they scaled it out. And the thing is, Nvidia did scale it up, and
nothing AMD was doing really looked like it was going to prevent them from getting dominated near the end of the year. I mean, they had Mantle, which was interesting and had a lot of hype around it; I remember it being kind of worthless the first year it came out, but in late 2014 they fixed the bugs and it was maybe a 10% performance boost and a little smoother in Battlefield 4, and we can thank it for Vulkan, but it just wasn't enough. And again, this is that video I made about why AMD allows Jensen Huang and Nvidia to act like Edison: why does AMD come up with all these great ideas and then just not execute on them, while when Nvidia comes up with a great idea, man, do they hammer it home over and over? They really make sure you know when they win, and they really made sure you knew they won when Maxwell came out.

This was bad, really bad. I remember seeing this; let me just zoom in here on this review. Where do I even start? It beat the 290X, and the conclusion said something like, 'personally I wouldn't upgrade from anything more recent than the 680, but users of older cards should definitely look into Nvidia's new products; oh, and AMD...' Now, a lot of people accuse TechPowerUp of being a bit of an Nvidia shill, and yeah, there does seem to be some bias going on there, but their numbers are pretty accurate. And this is my point: you had reviewers literally writing AMD off. So let's see what happened here. The 980 comes out, and I remember at first not really being that impressed; I saw it as like a half upgrade. I mean, it was stronger than the 780 Ti, it was stronger than the 290X, and if I go to TechPowerUp's performance summary, we can easily see here that in 4K it's only like 11% stronger than the 290X. Who cares, right? It's like the same price, launching a year later. Well, the thing is, this thing used like a hundred and fifty watts.
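As an aside, the perf-per-watt gap being described here can be sketched in a couple of lines of arithmetic. This is just an illustrative calculation: the ~150 W figure for the 980 and the +11% 4K lead come from the video's recollection, while the ~250 W figure for the 290X is an approximate board-power number assumed for the sketch, not an exact review measurement.

```python
# Rough perf-per-watt comparison: GTX 980 vs R9 290X at 4K.
# All wattage/performance numbers are approximate and illustrative.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance divided by power draw."""
    return relative_perf / watts

r9_290x = perf_per_watt(1.00, 250)  # baseline: 290X, ~250 W assumed
gtx_980 = perf_per_watt(1.11, 150)  # ~11% faster, ~150 W

advantage = gtx_980 / r9_290x
print(f"Maxwell perf/W advantage: {advantage:.2f}x")  # prints 1.85x
```

Even with generous assumptions for the 290X, Maxwell comes out close to twice the performance per watt, which is why the raw 11% lead understated how lopsided the launch felt.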
This was one of the most efficient cards ever released, especially for the time it came out; this was an utter bloodbath in that respect. I mean, think of it: it was beating the 290X by 11% while using less energy than every card on this chart going all the way back to around an R9 285. That's how far back you have to go to find something that competes with it in power usage, and it was like 50% stronger than that, easily. So that was bad, that was really bad, and AMD just had nothing to compete. They dropped prices, and that was good, but it just wasn't enough against Nvidia. And this is what I learned, people: it's not just about price-to-performance or top performance. People want to feel like they have the cutting edge, and Maxwell made you feel that way; this was the most efficient architecture and the most powerful architecture ever released. It didn't matter that the 290X was only, let's say, 10-20 percent behind and they were selling it for $400 at one point, 30-40 percent less money for only 20% weaker performance; it was just perceived as less advanced, and people bought Maxwell in droves.

I remember this being the first time I just had to admit that Radeon was behind, and this was very weird. Again, I come from an era, when I got into PC gaming, the 4000 through 6000 series, where AMD was making Nvidia look like a joke and they actually had most of the market, where I was basically doing throwdowns with Nvidia fanboys going, 'why are you buying a GTX 480? It costs twice as much as a 5870, it's the same performance, and it uses over double the energy.' Like, the 5970, a dual-GPU graphics card, used less energy than Nvidia's single graphics card and cost the same amount of money; it would be like if the GTX 690 used less energy than a 7970 and was that much more powerful. And yet people would defend it. But now I had nothing. This is when you
just had to admit Nvidia was in the lead; this was Nvidia's Sandy Bridge moment. And I remember a friend building a PC when the 970 came out for three hundred and thirty dollars, which, to be honest, using their normal nomenclature, was really a 960 Ti or a 960; but because their new architecture was so good, they just called it a 970 and sold it for 10% more money. But it was what it was. I remember my friend asking, 'should I get a 270X or a 660 Ti?' and I was like, 'a 970.' 'Well, what if I want to save money?' 'A 970.' 'What if I want to spend more?' '970.' Its price-to-performance was king: this was $330 and this was $400, and it used half the energy. Half the energy.

And then the 3.5-gigabyte fiasco came, and this was such a bizarre time for me. This is where you saw people look at frame time comparisons: the 980 is like 25 percent stronger than the 970, and look at these frame times, they're atrocious on the 970 once you use more video memory. I remember Nvidia defending it, saying, 'oh no, there are just two segments of RAM.' Eventually they lost; there was a lawsuit. It was really a 3.5-gigabyte card they sold as a 4-gigabyte card, and in fact it also had part of its ROPs disabled as well; they were just straight-up lying about the specs, hoping no one would notice. And this is one of those times where I started to become very conflicted, because Nvidia had better products here; but did they? Because then I found out they had less RAM and ROPs than advertised, and within six months after launch there were certain games where the 970's performance was dropping like a rock. This is where this mistrust started to evolve, at least for me and many people, with Nvidia. This is that video I had, AMD versus Nvidia fanboys, where AMD fanboys become dogmatic and call you evil for buying Nvidia; I've never let it get like that, but I get it. Every time Nvidia hits a home run, there's something weird that
happens that makes you just not trust them, whether it's GPP, whether it's their washed-out colors with memory compression, whether it's this 3.5-gigabyte thing. It just always seems like, if Nvidia is ever in charge, they will find just a little way to screw you over, hoping that you don't notice. But anyways, I digress.

So then they released the Titan X, which was a good 35-40 percent stronger than the 980, putting it all the way up here. Oh yeah, and the 295X2 made the Titan Z look like a complete joke. I'm not going to go into that, because rarely do people buy $1,000 or $1,500 dual GPUs, but everyone should know that Nvidia did try to launch a Titan Z for $3,000. So the groundwork was there: Nvidia wants to charge $3,000 for Titans, and now they do. Now they do. They were just laughed out of the room before, because AMD was able to launch something stronger for half the price; but AMD can't do that anymore, so now Titans are $3,000. Back then, Titans were a thousand dollars, twelve gigabytes for the Titan X.

And we started to hear rumors of the Fury X, and everyone was hoping this would be another 290X situation; everyone was really hoping, but it wasn't. What we got was, basically, in 4K the Fury X tying the 980 Ti while using about the same amount of energy, and then at any lower resolution the 980 Ti pulled ahead by a little bit. Effectively they were very similar cards, but it became this thing of: the Fury X came out seven months later, right? If you're coming out later and you only match performance, you typically expect the price to be lower. And the reason this happened is that AMD was planning to compete only with the Titan X. They thought Nvidia would keep the 980 at $550 and the Titan X at a thousand, and then they could slide in right at $750 with a cool, very compact, water-cooled graphics card that almost matched the Titan X but was $250 cheaper, and was better overall
than the 980. But Nvidia caught word of this and released a slightly cut-down 980 Ti, which is what they've been doing ever since, and AMD was just forced to sell the Fury X with tight margins. This is also the start of AMD selling their top-end cards with only about a 10 percent profit margin, if that, because that's what they had to do to stay in the game. This is, again, AMD's Battle of Dunkirk, as you saw in my thumbnail; this is where they were just releasing what they could to stay in the game. It was very depressing to watch this happen not only with their CPUs but start to happen with their GPUs too.

Overall, though, the Fury series was an interesting one, very interesting actually; if you undervolted them, you could make them insanely efficient. Also, the R9 Nano was an incredibly impressive card that, for more money than a 980, beat the 980 by about 10% in performance while using about 10% less energy. It was an incredibly impressive card, and technically AMD had the most efficient card again; even in this era, it's worth mentioning, the Fury Nano was more efficient than any Maxwell card. But it cost more to make, and so they priced themselves out; it really probably should have been cut down to the same performance as a 980 and sold for about 500 bucks. But you know, the Fury and the Fury Nano were received well, the Fury X was an interesting card for ITX builds and people who wanted a silent PC, and they were priced competitively; yet they were losing market share. Things were not looking good, and this was really the start of the slow bleed.

And I think it's worth pointing out something here. A lot of the reason AMD started to lose market share, in my opinion and in a lot of people's opinion, wasn't just their not being able to take the top crown anymore, nor Nvidia's final sledgehammer of a performance win with Maxwell. It was also the fact that they just kept rebranding and rebranding; this is where the meme 'Rebrandeon' comes from, and this is
where you saw the 7970 become the 280X, and then the 380X was effectively a more efficient 7970; the 7870 and 7850 became the 270, and then the 370. I mean, oh my god, AMD. The 290 became the 390, which, to be fair, was better: 10% stronger with a little less energy usage and double the RAM. It was a good card; in fact, the first card I bought new in years was a 390X. I had a good job, an engineering job at a big major engineering company for the first time, and I just spoiled myself. I couldn't see myself getting the Fury because it was only about 20% stronger than the 390X and had half the RAM, so I got a 390X. And this is when I learned how amazing aftermarket coolers were in the 2015/2016 era; they were absolutely amazing, how silent they were, and the fans would turn off at idle, I loved that. But look: you've got a 390X, which is a rebrand, competing with a 980 that's using two-thirds, at most almost half, the energy. It just didn't look good, and I think people want to buy the newest technology. That's why AMD has to get it through their heads that they can't rebrand more than once, and if they do rebrand, it has to actually come with some new features. You know, at least the 580 had some newer features and was a little more efficient overall; but moving forward, AMD just needs to stop rebranding. People want to feel like they're buying the newest technology, and it doesn't matter if it's only a little better at a certain price segment: when Nvidia came out with Maxwell, they had a new lineup; when they came out with Pascal, they had a new lineup; and AMD just wasn't doing that.

So that was basically 2015. AMD's stock price was hitting all-time lows, things were not looking good, and this led right into 2016. Let's move right on; I won't even skip videos here. So, 2016: lots of rumors. In early 2016 we were already talking about Polaris and Vega, and people weren't sure if they were the same
architecture, or if one was a code name for a segment of the Polaris architecture, like Vega was a star and Polaris was the series; no one was really sure. And this was, oh my god, what I will call the silicon rumor website boom: WCCFtech, Fudzilla, just all of these rumors around 14nm and 16nm chips, because people were desperate. At least HBM was new, at least Maxwell was a completely new architecture that was revolutionary in efficiency, but oh my god, you have to remember there were some people that just would not buy a new GPU in 2014 or 2015, even if it was a lot stronger, because they said, 'no, I have a 7970, I paid $500; I have a 680, I paid $500; this is 28nm. I bought a 5870 on 40nm and then I got a 7970 on 28nm; I'm not buying any new cards until I get a new node.' And so there were a lot of people waiting, and lots of rumors about what was going to happen here, and so these series sold like hotcakes, because there was just a complete build-up of people wanting to buy new products. There were rumors all the way out to the 1080 Ti and what that would be.

And then we have the 1080, and what I think is interesting about the 1080, and Pascal in general, is that Pascal clearly performed better than everyone expected. It was Maxwell 2.0, so no one really expected that much from it, but just because it was Maxwell 2.0 doesn't mean it can't over-perform. It added some mitigation features so that asynchronous compute wouldn't hurt it anymore; back then Maxwell really suffered in compute-intensive games. Pascal didn't benefit from FP16 or asynchronous compute, but it could do a faked FP16 and a sort of asynchronous compute workaround, so that it at least didn't lose performance in Doom. And Pascal was built to have the highest clocks possible, and man, did it over-perform. We got a fifty-to-sixty percent
increase in performance per segment, and Nvidia knew just what to do: Nvidia knew to milk it. And this is why I've always said that I was not surprised by RTX's pricing. Look, mid-2016, $700 for an 80-class card. Come on, people, why is anyone surprised at how expensive the 2080 is? They were doing this in 2016; the RTX pricing is not new. All through the holiday season in late 2016, this goddamn card was going for eight hundred dollars on Newegg. Eight hundred. The Founders Edition was $700, and back then their Founders Edition coolers were shitty, unlike now where they're okay, so all the AIB models were going for about $750, which is the same as now. This was the start of the milking, but people wanted 16nm so badly that I just think no one paid attention to it. So I don't want to hear any complaints about Nvidia's pricing with RTX, because they were doing this two, well, three years ago. Again, people, what the hell are you complaining about?

So, this was called the 'mad king of GPUs' by TechSpot. Let's just take a quick look at its performance. I mean, it was 50% stronger than anything AMD had, it used less energy, and it dominated; you could game in 4K with it. You have to remember G-Sync monitors were out by then, so they were marketing this as fine for 4K gaming, which, compared to the shitty 4K performance you had before... think of the 290X: you basically had to turn down a few settings to get 40 frames a second at 4K; well, now you could turn down a few settings and get 60. So it sold like hotcakes.

And then Polaris came out, and the 480 underperformed. I think everyone was hoping the 480 would be about 20% stronger, so it would go right about here, just behind the 980 Ti. However, I think everyone also expected it to be about $300 or more, and the 4-gigabyte version of the 480 was $200, and it sold like hotcakes anyway, because people
were just waiting it didn't matter if it underperformed a little bit the fact is it was $200 for 290x like performance and it only used 150 watts this thing sold like hotcakes a lot of it was to minors to in fact Huaraz sold so well and I distinctly remember this you know I knew it was the 470 was gonna be very close to the 480 because typically AMD segments cards into families you know like to 90 and 290 x7 9:57 nine seventy four seventy and 480 are the same family ones the strongest when they cut it down slightly usually the slightly cut down car like Vega 56 versus Vega 64 is only 10% weaker and it's usually like two thirds the price so I remember driving back on a business trip and I sold my 390x to a guy for $300 and it's cuz I knew the 470 was gonna come out and I was gonna be able to overclock it to about the same performance and just basically save 150 bucks and it was funny because at the last minute everyone knew the 470 was gonna be $150 they're released I mean there were minor statements even from AMD personnel one you know 200 for the 484 gigabyte 234 the 488 gigabyte and 150 for the 470 and I remember I believe was the Tom's Hardware reveal where they said at the last minute it was two hours before their abusement live and they were calling the AMD reps going seriously how much does this cost or reviews done we need to type the last paragraph and at the last minute they said $180 but it really wasn't 180 it sold for like 190 to 200 in AMD did this because they knew it would sell out and it did I barely got my hands on a high-end model of the 470 luckily it was an 8 gigabyte version with overclocked RAM that performed within 5% of a 8 gigabyte 480 so I basically got a 480 for a little less money but the fourth I mean that tells you how well the 480 was selling that at the last minute AMD effectively decided to for sell the 474 like 10% less just because they knew it would sell out and it was selling to minors as well that's worth pointing out this is 
what new mining boom was starting here so that's what's it that's really what's so interesting here is watching no matter what it was aim dear Nvidia 40 nanometers 16 animator it was selling like crazy because people just wanted the cards so badly so yeah that was 2014 through 2016 it was the long long wait to get the hell off of 28 nanometer it was also the start of watching Intel be stuck on there 22 and 14 nanometer nodes as well and just not be and just you just saw the complete destruction of Moore's law in this time period I mean it was already basically gone before but you know Moore's law was dead just like my channel name and this was the awkward period of watching AMD become known for its console and semi custom manufacturing as they just ceded control of a lot of the consumer PC market because they had to again this was their Battle of Dunkirk and it was a lot of just laying the groundwork for the future here budget builds were great but top performance was slowing down and I guess the last thing I would really drive home here is how weird this period was like if you again were in the 2009 through 2011 era you saw 16 to 5 nanometer chips then he saw 55 nanometer cards like with the three eight seventy or four eight seventy and he saw 40 nanometer cards you know yet 40 nanometer cards in 2010-2011 and then you know and in 2011 you have 28 nanometer and then 2012 you had 28 nanometer and then 2013 year 28 nanometer and then 2014 year 28 nanometer and then 2015 you had 28 nanometer and that's not even true today think about it right if 2015 was 28 nanometer we got 14 and 16 nanometer in 2016 and then we got Vega 20 and 10 atti 16 and 14 nanometer in 2017 but at the end of 2017 Titan V came out that's 12 nanometer in 2018 we have 12 nanometer r-tx cards coming out and in 2019 now we have seven nanometer cards launching so you can't even say that this is just the new norm because it really isn't 28 nanometer overstate it's welcome and I don't know if we'll 
ever see that again we probably will in the 5 to 3 and nanometer era in about two years but this was really just an abnormal time moving on to my next video we'll get into finally more CPU heavy discussion we're GPU has become completely boring but CPUs become fascinating so this was a lot of work this was my longest video in this series but I really thought of it as one era please like subscribe and for the love of God please share these videos if you liked them leave your comments below I love seeing other people's opinions of what they remember from this era yeah like subscribe and thank you


7 thoughts on “10 Years of Being a PC Gamer (Part 4): The long wait for 16nm…”

  1. In my view, the 750 Ti really changed what was possible not only for budget builds, but also for fitting within modest cooling, along with the relative handful of low-profile editions being sold. Before it came out, the most powerful low-profile card from that time (AFAIK) was the HD 6570. This made a near-instant splash, so one could buy a PC from an office sale or classifieds page, built as far back as the Core 2 Duo era, and run games at 1080p. Even today, it only marginally beats the GT 1030, which peaks at 30W, well under half of the 75W PCIe slot limit. Not to mention that the 1030 can also be passively cooled at that level of power draw.

  2. Could it be that Lisa Su redirected financial resources from the Radeon Group to the CPU branch of AMD because she estimated the CPU market to be more lucrative than staying competitive in the GPU space? I'd really like to see you cover the reasons for the downward trend of the Radeon Group over the past 5 years in a future video.

  3. I still have my 7850K overclocked at 4.7GHz; I played so many games on that thing. It wasn't really that bad once I dialed in the memory settings (I ended up with the DDR3 at 2520 with decent timings).

    I also bought a 470 8GB, loved the card so much. I started doing all kinds of stuff to it and ended up at 1495MHz with a Kraken X62 cooling it, and the card had Samsung memory, so I could run it at 2100 without the timings changing. It's actually faster than my 580 8GB card. It upset so many people to see my 470 matching higher-class GTX cards of the time on the 3DMark ladder. https://www.reddit.com/r/Amd/comments/6rbvlc/how_the_rx470_8gb_kraken_x62_roughly_compares_to/

    Great times. Looking forward to more fun on 7nm.

  4. Dude, the Athlon 5350 was sooo bad. It was hard to use even in a browser environment. It supported only single-channel memory, which contributed to the bad experience. I sold the platform after a year. But good point: it's OK if you know what you are getting.

    I had an R9 Fury X from 2016 to 2017. It was OK, but the memory was starting to cause problems in some AAA games. So I bought, on purpose, a GTX 980 Ti, overclocked it to 1450/8000, and to this date the card gives me 20 to 30% above the performance of the R9 Fury X @ 1100/1050. I'm still using the GTX 980 Ti, and today I bought Anthem to play. I can play it at 1080p Ultra no problem. So yeah, Maxwell was the start of the fall of Radeon, and now with Pascal and Turing things are even worse. It is depressing, yeah… At my core I'm a Radeon user and I want good GPUs from them. Something big needs to change in RTG.

    A 40 min. video, wow. We are listening, so keep up the good work.

  5. Built my first PC in 2016; my friend gave me an FX-8350, 8GB of DDR3, a 120mm water cooler, and a motherboard for free. I remember debating whether to wait for an RX 480 or buy a 970 right away. Should've gone with the 480, because I could've sold it for $500 a few months later, but I let my Nvidia-biased friends convince me otherwise. In 2017 I upgraded to my current CPU, an R5 1400, plus a used GTX 980 for $150; I bought both in April, so I was an early adopter of Ryzen. Now I've got a 1070 for $200 and I'm looking to upgrade to Zen 2. I should've invested; I knew Ryzen had to be good, and at $5 a share the stock was stupidly low. Hindsight's a bitch hahaha

Leave a Comment