For those who do not want to build their own computer, or who would be put off by the tedious workload, the Dell Inspiron 5675 is an exceptional choice that deserves recognition from anyone looking for a gaming PC. Often, the out-of-box experience of a prebuilt is undermined by the need for an additional upgrade or two, but surprisingly that is not the case with the Dell 5675. It plays as it should and delivers excellent performance, both in games and in other demanding computational tasks. The computer itself is fairly compact, the construction feels solid, and it has a manageable weight. I also liked the slit-style chassis that shows off the blue LEDs inside (they can be turned off with the installed Dell Light Bar software). On the inside, there is plenty of room to upgrade: two free 22x80mm M.2 slots, labeled for SATA and PCIe (NVMe) configurations respectively; one 2.5'' bay for an SSD and one 3.5'' bay for a secondary mechanical drive; and one free DIMM slot for more memory. The motherboard also has three extra SATA 6 Gb/s ports and one more free PCIe x16 (electrical) slot. The board is based on the X370 chipset and was identified by CPU-Z as manufactured by Dell, model number 07PR60. Lastly, the power supply is rated at 460 watts with 18 A on the 12 V rail.
For my gaming tests, I compared it to the five-year-old machine I built myself, which houses a 2 GB GTX 670 and an i7-3770K, and used that machine's numbers as the performance baseline. I also used several monitoring tools, including 3DMark, Fraps, GPU-Z, MSI Afterburner, and HWMonitor. Since I wanted to focus on how the system played directly out of the box, I did not update any drivers before my performance tests. I tested three of the more demanding titles I own: Rise of the Tomb Raider (2016 PC), Crysis 3 (2013), and Doom (2016). I stuck to the presets in each game and disabled both film grain and motion blur where applicable. Everything was run at full-HD 1920x1080 resolution, and I set the fan speed to 75%.
On Rise of the Tomb Raider, the opening scene on the mountain has to be one of the most intensive sequences of any game I own. Using the Very High preset, then pushing shadow quality, sun soft shadows, reflection quality, and hair quality to maximum, the RX 580 reached around 55-60 FPS. As the weather effects kicked in, I saw dips to the low 50s, and the further I progressed it settled around the high 40s, but it still felt smooth and was more than playable. As the mountain gave way, the lowest I saw was around 43 FPS, and it climbed back toward 60 once you were able to climb the ice. On my GTX 670, I would start at around 35 FPS and it would get progressively worse as the weather effects came in, dipping to the low 30s, and once the mountain gave way it dropped to around 26 FPS. I also ran the in-game benchmark: the 670 posted an overall 36.63 FPS on the regular High preset without the modifications mentioned above, while the RX 580 posted 64.11 FPS.
On Crysis 3, the opening sequence on the ship is quite intense, with rain effects and high detail. I ran everything on Very High settings and left anti-aliasing on FXAA. Once you acquire your first weapon, the frame rate jumps from 44 to almost 60 FPS. With my character stationary, I noted around 53-58 FPS. Moving down the platform it dropped to around 48, but once I started for the bridge I saw 53-64 FPS, varying greatly depending on where I looked. That is a great improvement overall: in the same area, the GTX 670 would struggle at around 35 FPS, bounce around the upper 40s, and barely hit 50 in places. Crossing the bridge, it would hover around 45 FPS or lower. Once inside the ship, the RX 580 really stretched its legs and I mostly got 60-70 FPS. Occasionally there were random dips into the 50s or high 40s, but overall the performance is night and day compared to before.
Lastly, I tested the newest Doom, and its performance nearly exploded onto my screen. I set everything to the Ultra preset with TSSAA 8x anti-aliasing and 16x anisotropic filtering. The frame rate usually averaged around 70-80 FPS, and during the first big nest fight it never dipped below 62 FPS. By comparison, my 670 ran with large 18-26 ms latency spikes and hovered between 38 and 60 FPS. On the RX 580, latency spikes never occurred: on the in-game metrics, both the CPU and the RX 580 consistently held green latency readings under 11 ms and never once showed red latency data or lagged during gameplay.
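As a quick sanity check on the headline numbers, here is the relative gain of the RX 580 over the GTX 670 in the Rise of the Tomb Raider built-in benchmark, worked out from the two overall scores reported above (a back-of-the-envelope sketch, nothing more):

```python
# Average FPS from the Rise of the Tomb Raider built-in benchmark
# (High preset), as reported in this review.
rx_580 = 64.11
gtx_670 = 36.63

gain = (rx_580 - gtx_670) / gtx_670 * 100
print(f"RX 580 advantage: {gain:.0f}%")  # roughly 75% higher average frame rate
```

That lines up with the manual play-through numbers, where the RX 580 held the high 40s to 60 FPS in scenes that pushed the 670 into the mid 20s.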
In addition to games, I wanted to test the multithreaded capabilities of the Ryzen 1700 against my i7-3770K, specifically in video encoding. For this test, I used HandBrake 1.0.7 and ran two encodes. The first was a 90-minute HD video encoded with the HQ 1080p30 preset. The second was a 24-minute HD video encoded with the Super HQ 1080p preset, this time with the reference frames raised from 4 to 5. In both cases I left the frame rate and audio equal to the source. Here are the results:
HQ 1080p30: Ryzen 1700 - 1 hour and 38 seconds (36.33 FPS) / i7-3770K - 2 hours and 15 seconds (18.19 FPS)
Super HQ 1080p: Ryzen 1700 - 24 minutes and 57 seconds (22.05 FPS) / i7-3770K - 47 minutes and 58 seconds (11.46 FPS)
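Working from the average encode rates alone (the FPS figures above), the speedup comes out to roughly 2x on both presets; a small sketch of that arithmetic:

```python
# Reported average encode rates (FPS) from the HandBrake runs above,
# as (Ryzen 1700, i7-3770K) pairs.
tests = {
    "HQ 1080p30 (90 min source)": (36.33, 18.19),
    "Super HQ 1080p (24 min source)": (22.05, 11.46),
}

for name, (ryzen_fps, i7_fps) in tests.items():
    # Ratio of encode throughput: higher means the Ryzen finishes sooner.
    print(f"{name}: {ryzen_fps / i7_fps:.2f}x faster on the Ryzen 1700")
```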
As you can see, the multithreaded performance obliterated my older CPU. The results were pretty staggering; I did not expect it to cut the encode time roughly in half on both tests. For a single-threaded task, I decided to encode a 24-minute WAV file to MP3 using LAME 3.99. I used MusicBee 2.4 with a constant bit rate of 320 kbps and the highest-quality internal algorithm setting of 0 (the -q switch). I ran each encode twice:
MP3 encode: Ryzen 1700 - 125 seconds/123 seconds / i7-3770K - 126 seconds/126 seconds
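Unlike the HandBrake runs, this single-threaded test is essentially a tie; taking the best run from each CPU, the gap works out to only a couple of percent:

```python
# Best reported LAME encode times (seconds) for each CPU, from the runs above.
ryzen_best = 123
i7_best = 126

delta = (i7_best - ryzen_best) / i7_best * 100
print(f"Ryzen 1700 finished about {delta:.1f}% sooner")  # roughly 2.4%
```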
The last test I ran was the 3DMark Fire Strike benchmark, using the basic version. My first results were flagged as invalid due to an incorrect driver version. Still, to keep with the out-of-box experience, I recorded those results anyway and only then updated the drivers for a valid run. All games, though, were still tested on the stock drivers as mentioned previously. Here are the results:
Default Driver 16.11.5 - Overall: 10,890/Graphics: 12,522/Physics: 16,804/Combined: 4,248
Current Driver 17.7.2 - Overall: 11,044/Graphics: 12,885/Physics: 16,535/Combined: 4,299
My Computer - Overall: 6,309/Graphics: 7,021/Physics: 9,954/Combined: 2,733
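To put the driver update in perspective, the percentage change between the two driver versions can be computed directly from the scores above (a quick sketch over the reported numbers only):

```python
# 3DMark Fire Strike scores as reported above, per driver version.
default = {"Overall": 10890, "Graphics": 12522, "Physics": 16804, "Combined": 4248}
current = {"Overall": 11044, "Graphics": 12885, "Physics": 16535, "Combined": 4299}

for key in default:
    change = (current[key] - default[key]) / default[key] * 100
    print(f"{key}: {change:+.1f}%")  # graphics rises, physics alone dips slightly
```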
Overall, the updated driver shows mostly an improvement, aside from a slightly lower physics score. Comparing frame rates against the default drivers, there was a reduction of 0.85 FPS on the physics test and a 0.22 FPS difference on the combined test. One last thing I noticed with the default drivers: the RX 580 would frequently spike its memory clock between 300 MHz and 2 GHz at idle. Even after restarting the computer, the card's temperature would quickly rise past 60 Celsius. Coupled with the low default fan setting (25% @ 850 RPM), the card could not idle properly. After I updated the drivers to 17.7.2, the memory clock spiking was fixed and it held consistently at 300 MHz at idle. Afterwards, I monitored the temperatures at 25% fan after a cold boot and found they were back within range, around 28-36 Celsius. I strongly recommend monitoring your temperatures to make sure everything is alright, but despite being annoying, this minor issue is easily fixed.
Aside from the performance, I do have some minor gripes with the system overall. The first is that the 16 GB of RAM was not configured for dual channel (2x8 GB); it would have been beneficial had Dell included two 8 GB modules for proper bandwidth. The next problem I noticed was that both the Radeon software and GPU-Z reported the RX 580 running at only PCIe 3.0 x8. This was strange, since the slot should support x16 and the BIOS has no setting to change the PCIe lanes. Nonetheless, performance should remain unaffected, as scaling between x8 and x16 typically shows little to no difference for a single GPU. Another problem is that the BIOS is pretty basic; I would have loved to see a proper GUI for the UEFI BIOS, as you would normally get if you had built your own. I also noticed that there is only one headphone jack; a 3.5mm microphone port is mysteriously missing, which might be an issue for some. I would have liked a Blu-ray burner instead of the slim DVD drive, though those are cheap enough as an upgrade. Lastly, the hard drive activity light is missing from the front as well. Despite these minor issues, the Dell 5675 still performs exceptionally.
In conclusion, I would highly recommend the Dell 5675 to anyone who needs a fast computer but does not want to build their own or lacks the expertise to do so. The gaming performance out of the box is stellar; it ran everything at 1080p smoothly with no issues. Furthermore, it destroyed my i7-3770K in multithreaded tasks and edged it out in single-threaded ones. AMD has a real winner on their hands. Ryzen is a workhorse, regardless of the criticism it has received for its single-threaded performance; it is ridiculously fast and a capable gaming CPU. I do not see a problem with sacrificing some IPC/single-threaded performance for such a huge multithreaded gain, especially since games are slowly utilizing more cores. At its price, I would say it is quite worth it, provided you can live with the issues I highlighted above. With all that said, the machine is ready to perform to your needs.