Search the Community

Showing results for tags 'intel'.

Found 19 results

  1. Boildown

    Ongoing Discussion AMD 5000 series / Zen 3 Details

    https://www.anandtech.com/show/16148/amd-ryzen-5000-and-zen-3-on-nov-5th-19-ipc-claims-best-gaming-cpu This is when I finally switch teams to AMD for the CPU (lol, I don't think AMD's GPU will be any good though). Amazed that the boost clock keeps going up on the 16-core part compared to the 12-core. Need to look at it closer to see if there are any gotchas, and I'll be waiting for reviews before buying, as always. But AMD is about to wipe away Intel's last advantage. I'm still on my i7-4770K, so this will be a huge upgrade.
  2. Gallitin

    Build help New Build

    My old build was passed down and it's time for a new one; parts get here Thursday to start assembling.

    • CPU: Intel Core i9-9900K Coffee Lake 8-Core, 16-Thread, 3.6 GHz
    • CPU Cooler: Corsair Hydro Series H150i PRO RGB, 360mm
    • Motherboard: ASRock Z390 Taichi Ultimate LGA 1151
    • Memory: CORSAIR Vengeance LPX 32GB (4 x 8GB) 288-Pin DDR4 3000 (PC4 24000)
    • Video Card: GIGABYTE GeForce RTX 2070 GAMING OC 8G, 3 x WINDFORCE fans, 8GB 256-Bit GDDR6
    • Hard Drive: SAMSUNG 970 PRO M.2 2280 512GB
    • Power Supply: CORSAIR RMx Series RM850x CP-9020180-NA 850W
    • Tower: Thermaltake View 71 RGB 4-Sided Tempered Glass Vertical GPU Modular E-ATX Gaming Full Tower
    • Extra Fans: Corsair HD Series HD120 RGB LED, 120mm

    I'd like to get more in depth with overclocking this one. Assuming it will stay more than cool enough to push it some. Anyone have experience with this type of mobo's settings to give some suggestions?
  3. Gremlich

    Ongoing Discussion Compare 9900K vs 2700X

    I checked just to see how the 2700X fared against the i9-9900K. This won't dissuade Intel fans: the i9-9900K is faster, by about 16% overall, and of course Intel still does single-core better. Cost for the 2700X: $280. Cost for the i9-9900K: $500. While the 6-core, 12-thread 8700K beats the 2700X in single- and quad-core performance by about 10%, the 2700X wins on multi-core workloads. PC gaming and desktop performance are generally governed by four or fewer cores, but the 2700X offers unbeatable value for money for workstation users. So although the 9900K has around a 16% effective speed advantage over the 2700X, the Ryzen 2700X offers better value for money for most users.
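The value argument above can be sanity-checked with simple arithmetic. A minimal sketch, using only the prices and the ~16% effective-speed figure quoted in the post (not real benchmark data):

```python
# Performance-per-dollar from the figures quoted above.
# Effective speed is normalized so the 2700X = 1.00; the 9900K is ~16% faster.
cpus = {
    "Ryzen 7 2700X": {"price_usd": 280, "relative_speed": 1.00},
    "Core i9-9900K": {"price_usd": 500, "relative_speed": 1.16},
}

for name, s in cpus.items():
    per_dollar = s["relative_speed"] / s["price_usd"]
    print(f"{name}: {per_dollar * 1000:.2f} speed units per $1000")
```

On these numbers the 2700X delivers roughly 3.57 speed units per $1000 versus 2.32 for the 9900K, which is the "better value" claim in a nutshell.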
  4. Tillion

    Ongoing Discussion Lets see your Rigs!

    Let's get an updated thread of everyone's custom rigs that they are already using, or will be using, to play SC. Please attach a hardware list and note any future upgrades you'll be making. Try to limit it to 2-4 pics per post to keep load times down on smaller devices, and tag your chipset (AMD or Intel) in the tagline. Can't wait to see what you guys are running! I've already posted this on my profile but I'll start anyway.

    • CASE: Phanteks Enthoo Evolv PH-ES515E_GS Aluminum/Steel Galaxy Silver Window ATX Mid Tower Computer Case
    • MoBo: MSI Z170A KRAIT GAMING R6 SIEGE LGA 1151 Intel Z170 HDMI SATA 6Gb/s USB 3.1 ATX Intel Motherboard
    • CPU: i7 6700K 4.00 GHz Unlocked Quad Core Skylake
    • RAM: CORSAIR Vengeance LED 32GB (4 x 8GB) 3000, CAS latency 15
    • CPU COOLER: DeepCool CAPTAIN 240 WHITE liquid cooler
    • BOOT DRIVE: Samsung 960 EVO 500GB M.2
    • GPU: MSI 1080 Ti FE, modded w/ Corsair H55 and NZXT Kraken G10
    • PSU: Corsair RMi Series RM850i
    • MODS: CableMod custom cable sleeves; two 12" pure white cold cathode lights; triple fan mod for top of case
    • MONITOR: Samsung UHD 4K 28"
  5. Headline really sums it up. https://www.tweaktown.com/news/59654/intel-launches-next-gen-optane-ssd-900p/index.html
  6. Several reviews of the upcoming Intel 8th gen CPUs have leaked, and it looks like AMD needs to watch out. In the reviews (with synthetic benchmarks here) the 8700K manages to slightly outperform the vaunted 1800X at multicore. If this is true, the 8700K is an incredible CPU: it outperforms a CPU with 33% more cores and a (presumed) +150 euro price tag, depending on where you live. Not to mention that it manages to seriously outpace it in gaming. Could Intel have stolen the price/performance crown from AMD here? Lastly, the reviews state that the 8600K was overclocked to 5.1GHz. While this is great, the CPU got up to temps of 92 degrees C. Still, perhaps Intel has learned from its critics regarding TIM. October 5th will be an interesting day, guys.
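The core-count claim above is easy to check with arithmetic: if a 6-core 8700K ties an 8-core 1800X on a multicore benchmark, each of its cores must be doing proportionally more work. A sketch, assuming an exact tie for illustration (the leaks actually suggest a slight 8700K win):

```python
# Implied per-core advantage when a 6-core CPU matches an 8-core CPU
# on a multicore benchmark (equal multicore scores assumed).
cores_8700k = 6
cores_1800x = 8

per_core_ratio = cores_1800x / cores_8700k  # 8/6
print(f"Each 8700K core would need to be ~{per_core_ratio:.2f}x faster")
```

That works out to roughly a 1.33x per-core advantage, from some mix of higher clocks and higher IPC.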
  7. Boildown

    Ongoing Discussion Intel "Kaby Lake" CPU Reviews Are Out

    Regular reviews are out. Many reviews at this link: http://www.hardocp.com/news/2017/01/03/intel_core_i77700k_kaby_lake_processor_review_roundup Here's another they missed: http://www.anandtech.com/show/10959/intel-launches-7th-generation-kaby-lake-i7-7700k-i5-7600k-i3-7350k The bottom line is that Kaby Lake's high-end CPUs give the same performance as their predecessor (Skylake), but perhaps overclock a little better, plus a few other features that probably won't matter to many people. However, the i3-7350K (an unlocked i3) is the real surprise for the mid-range or value segment: cheap and overclockable, and probably a great CPU for the gamer on a budget. I didn't see this one coming, and almost everyone who can't spring for the i5-7600K should be buying this instead.
  8. Hazzbeen

    The Pointmen

    Hi, I am new here. Being prior military infantry (Marine Corps 0341) myself, I thought of patrols and the importance of a good pointman / pointmen / lead convoy vic: the people at the front of the formation who see everything first and report it to the command unit/center, which may or may not be in sight of the objective. With fleets, maybe they are 1-2 jump points ahead, letting the fleet still in quantum know what it is about to be headed into, or they recon and scout the path in teams and make intel reports that could alter the entire fleet's route. Maybe they are scouting for a stealth bomb run to hit a pirate trade post. Who knows? The sky's the limit... or space... whatever. Maybe a 2-man team with a Hornet Ghost / Hornet Tracker, or any other combination of stealthy ships or ships with enhanced scanning capabilities. Maybe they can jump to a system that the fleet cannot, due to jump point size limitations or whatnot. I am not sure if someone else has posted anything like this. If so, please drop a link. I would love to brainstorm a bit.
  9. Next up on Tactics - see -- >> Multi-Squadron Battle Tactics - pt 4 - Intel and Diplomacy
  10. SSDs may be outdated by the time Star Citizen launches thanks to Intel Optane. "Intel Optane 3D XPoint will make game load screens 'a thing of the past'" by Edward Chester, 19 August 2015. Star Citizen creator Chris Roberts has described how Intel's upcoming new storage technology, Optane, will eliminate game loading screens forever. Speaking at the Intel Developer Forum (IDF), Roberts highlighted how the new superfast 3D XPoint memory will provide such fast data access that games will no longer need to load levels into memory a chunk at a time, but will be able to access the data near-instantly straight from storage. The result will be not just the elimination of level loading screens but also of the pauses that sometimes occur in games as you move from one environment to the next. Roberts, who is CEO and founder of Cloud Imperium Games and who previously worked on the space combat game Wing Commander, was particularly excited about the impact it would have on his latest creation, Star Citizen. The large-scale space exploration and combat title was announced in 2012 along with a crowdfunding campaign to get the game started, and has since gone on to be the most successful crowdfunding campaign ever, raising over $87 million. It involves moving seamlessly from vast space environments to the surface of planets, which is a task that requires a huge shift in the data that's being referenced – from vast networks of stars and planets to detailed rolling terrain. With Optane memory this transition will be smoother than ever before. Optane is actually the brand name for Intel's new range of storage devices, platforms and software that support its new memory technology, 3D XPoint. This new technology has the non-volatile, long-term data-storage abilities of the NAND memory used in SSDs but is also much faster to access – theoretically up to 1000x faster.
This means it can potentially replace system memory completely, rather than the current arrangement where data is permanently stored on an HDD or SSD and is loaded into system memory when accessed by the computer's processor. Although there will be advantages to this speed immediately, it may be some time before its true benefits are realised in games, as for the time being many games will be architected around the limitations of more average PCs and games consoles. Both Star Citizen and 3D XPoint memory are expected to arrive in 2016, with the former already available in beta form and the latter set to arrive both as a drop-in replacement for traditional SSDs and as a DIMM module for use in high-performance computing applications, where it will replace system memory altogether.
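To put storage speed in perspective, a back-of-the-envelope calculation shows how long merely streaming a level's assets takes at different sequential throughputs. The level size and rates below are illustrative assumptions, not Optane specifications, and note the article's 1000x figure refers to access latency rather than throughput:

```python
# Time to stream a hypothetical 2 GB level at various storage throughputs.
LEVEL_SIZE_MB = 2048  # assumed level size, for illustration only

def load_seconds(throughput_mb_s: float) -> float:
    """Seconds to read the whole level at the given sequential throughput."""
    return LEVEL_SIZE_MB / throughput_mb_s

for name, rate in [("HDD (~150 MB/s)", 150),
                   ("SATA SSD (~500 MB/s)", 500),
                   ("NVMe SSD (~2500 MB/s)", 2500)]:
    print(f"{name}: {load_seconds(rate):.1f} s")
```

Even this crude model shows why faster storage shrinks load pauses from double-digit seconds toward sub-second territory; latency improvements like 3D XPoint's attack the remaining per-access overhead on top of that.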
  11. With the fact now nearly everyone buys a new graphics card after one or three years, Who remembers the first or second ever Graphics card? I was 18 when i bought my 1st Card, Orchid Righteous 3D - 3dfx Voodoo 4MB for just £105 GBP, it was only used in 3D games, when in Windows, the 2D card was used. You could hear the mechanical relay click in when you went into a 3D game like, GTA, Carmageddon, or Jedi Knight. Of course come 2000 i was need of a serious Hardware upgrade, i built a new PC (AMD K6-2 450Mhz). I bought my second ever GPU it was the Creative Labs, Nvidia Riva TNT2 Ultra 32Mb for a mere £150 GBP . I can actually remember every GPU purchase for the past 18 years, but hey thats gaming for ya. What was the first video card you ever bought?
  12. Intel Announces Thunderbolt 3 - Thunderbolt Meets USB. Source: http://www.anandtech.com/show/9331/intel-announces-thunderbolt-3 A lot has been happening in the world of external communication buses over the past year. In the last 12 months the USB consortium has announced both 10Gbps "Superspeed+" USB 3.1 and the new USB Type-C connector, USB's new compact, reversible connector that is designed to drive the standard for the next decade or more. Meanwhile the introduction of USB Alternate Mode functionality – the ability for USB Type-C to carry other protocols along with (or instead of) USB Superspeed data – has made USB more flexible than ever, with the VESA announcing that DisplayPort will support alternate mode to deliver DisplayPort video over USB Type-C ports and cabling. As a result, the introduction of USB Type-C has led to a definite and relatively rapid transition over to the new standard. With the USB consortium having designed a very capable and desirable physical layer for Type-C, and with alternate modes allowing anyone to use that physical layer, a number of other technologies have started aligning themselves with USB in order to take advantage of what is becoming an even more common platform for external buses. [Image: USB Type-C connector on Apple's MacBook] This brings us to today, with the announcement of Thunderbolt 3 from Intel. With the advancements occurring elsewhere in the world of external communication buses, Intel has not been sitting idly by and letting other standards surpass Thunderbolt. Rather, they have been hard at work on the next generation of Thunderbolt, one that seeks to combine the recent developments of the USB Type-C physical layer with all of the feature and performance advantages of Thunderbolt, culminating in Thunderbolt 3 and its incredibly fast 40Gbps bus.
As a bit of background, the last time Intel updated the Thunderbolt specification was in 2013 for Thunderbolt 2, AKA Falcon Ridge. By aggregating together two of Thunderbolt 1's 10Gbps channels, Intel was able to increase the available bandwidth over a single channel from 10Gbps to 20Gbps, at the cost of reducing the total number of channels from two full duplex channels to one. Of particular note here is that with Thunderbolt 2 the Thunderbolt signaling layer didn't change – Thunderbolt 2 still operated at 10Gbps for each of its four underlying lanes – so in reality the Thunderbolt signaling layer has remained unchanged since it was introduced in 2011. Now at 4 years old, it's time for the Thunderbolt signaling layer to change in order to support more bandwidth per cable than what Thunderbolt 1 and 2 could drive. To accomplish this upgrade in signaling layers, Intel has needed to change the physical layer as well. Thunderbolt 1 and 2 used the Apple-developed mini-DisplayPort interface for their cables, but with the VESA signaling that it may eventually replace the DisplayPort physical layer with USB Type-C, the DisplayPort physical layer's days are likely numbered. Consequently mini-DisplayPort's days are numbered as well, as consumer devices and the development of new standards both shift over to Type-C. This has put Thunderbolt in an interesting situation that has it moving forwards and backwards at the same time. As originally planned, Intel wanted to have Thunderbolt running through USB ports, only for the USB consortium to strike down that idea, resulting in the shift over to mini-DisplayPort. Now, however, with the waning of DisplayPort and the introduction of USB Type-C and its alternate modes, Thunderbolt is back to where Intel wanted to start all along: a standard built on top of the common USB port.
The end result of this upgrade of virtually every aspect of Thunderbolt is the latest generation of the technology, Thunderbolt 3, which seeks to combine the strengths and capabilities of the Thunderbolt platform with those of USB Type-C. This means bringing together Thunderbolt's very high data speeds and the flexibility of its underlying PCI-Express protocol with the simple, robust design of the Type-C connector, all enabled via the USB alternate mode specification. Throw in Type-C's associated power delivery standards, and you have what Intel believes to be the most powerful and capable external communications bus on the market. Along with the change to using the USB Type-C port, the big news here is that Thunderbolt 3 is doubling the amount of bandwidth available to Thunderbolt devices. With Thunderbolt 2 topping out at a single full duplex 20Gbps channel, Thunderbolt 3 is increasing that to 40Gbps. Compared to DisplayPort 1.3 and USB 3.1, this is 1.5 to 4 times the available bandwidth, with DisplayPort 1.3 topping out at 25.9Gbps (after overhead), and USB 3.1 topping out at 10Gbps per channel (with Type-C carrying 2 such channels). From a signaling standpoint, Thunderbolt 3 is being implemented as a USB alternate mode, taking over the 4 lanes of high-speed data that Type-C offers. This is the same number of lanes as Thunderbolt 1 and 2 used, so the bandwidth increase comes as a result of doubling the amount of data carried per lane from 10Gbps (half duplex) to 20Gbps, which when aggregated at either end is what gives us 40Gbps full duplex. To handle the new Type-C interface and the increased data rates, Intel is rolling out a new type of active cable for the new Thunderbolt standard. Like previous-generation cables, the new cable includes significant active electronics at both ends, allowing Intel to achieve greater bandwidth than passive cabling would allow, at the cost of increased cable prices.
The new cable retains the distinctive Thunderbolt logo and is a bit larger than a passive cable at both ends to accommodate the electronics, but other than the change to the Type-C port it is similar in concept to Thunderbolt 1 and 2's active cables. Meanwhile, because it's built on Type-C, Thunderbolt 3 will also introduce support for passive cabling using the now-standard Type-C cable. When using a Type-C cable, Thunderbolt drops down to 20Gbps full duplex – the amount of bandwidth available in a normal Type-C cable today – sacrificing some bandwidth for cost. With Type-C cables expected to eventually cost only a few dollars, compared to thirty dollars or more for traditional Thunderbolt cables, this makes Thunderbolt far more palatable as far as cable costs go, not to mention allowing cables to be more robust and more easily replaced. Driving these new cables in turn will be Intel's new Alpine Ridge controller for Thunderbolt 3. The latest generation of the Ridge family, this controller steps up in capabilities to match Thunderbolt 3's 40Gbps speeds. Alpine Ridge also integrates its own USB 3.1 (Superspeed+) host controller, which in turn serves dual purposes. When serving as a host controller for a USB Type-C port, this allows Alpine Ridge to directly drive USB 3.1 devices if they're plugged into an Alpine Ridge-backed Type-C port (similar to how DisplayPort works today with Thunderbolt ports). And when serving as a device controller (e.g. in a Thunderbolt monitor), this allows devices to utilize and/or offer USB 3.1 ports on their end. The addition of USB host controller functionality further increases the number of protocols that Thunderbolt 3 carries in one way or another. Along with PCI-Express and DisplayPort, the use of Alpine Ridge ensures that USB 3.1 is also available, as it's now a built-in function of the controller.
The only notable difference here is that while DisplayPort video and PCI-Express data are encapsulated in the Thunderbolt data stream, USB 3.1 is implemented on top of the PCI-Express connection that Thunderbolt already carries, rather than being encapsulated in the Thunderbolt data stream as well. Speaking of encapsulation, Thunderbolt 3 also includes an update to the DisplayPort side of matters, though likely not what everyone has been expecting. With the increase in bandwidth, Thunderbolt 3 is able to carry twice as much video data as before. However, Intel is not implementing the latest version of DisplayPort – DisplayPort 1.3 – into the Thunderbolt 3 standard. Instead they are doubling up on DisplayPort 1.2, expanding the number of equivalent DisplayPort lanes carried from 4 to 8, essentially allowing one Thunderbolt 3 cable to carry 2 full DisplayPort 1.2 connections. The end result is that Thunderbolt 3 will not be able to drive the kind of next-generation displays DisplayPort 1.3 is geared towards – things like 8K displays and 5K single-tile displays – but it will be able to drive anything 1 or 2 DisplayPort 1.2 connections can drive today, including multiple 4K@60Hz monitors or 5K multi-tile displays. Meanwhile, gamers will be happy to hear that Intel is finally moving forward on external graphics via Thunderbolt, and after more than a few false starts, external GPUs now have the company's blessing and support. While Thunderbolt has in theory always been capable of supporting external graphics (it's just a PCIe bus), the biggest hold-up has always been handling what to do about GPU hot-plugging and the so-called "surprise removal" scenario. Intel tells us that they have since solved that problem, and are now able to move forward with external graphics.
The company is initially partnering with AMD on this endeavor – though nothing excludes NVIDIA in the long run – with concepts being floated for both a full-power external Thunderbolt card chassis and a smaller "graphics dock" which contains a smaller, cooler (but still more powerful than an iGPU) mobile discrete GPU. Another concept Intel has been floating around that will finally be getting some traction with Thunderbolt 3 is Thunderbolt networking. By emulating a 10GigE Ethernet connection, 2 computers can be networked via a Thunderbolt cable, and with 10GigE still virtually unseen outside of servers and high-end workstations, this is a somewhat more practical solution for faster-than-GigE networking. Thunderbolt networking has been around since 2013 in OS X, and in 2014 Intel demonstrated the technology working on PCs; however, since it was a feature added to Thunderbolt 2 after its launch, the number of PCs with the necessary drivers for Thunderbolt networking has been quite low. With Thunderbolt 3 this is now a standard feature at launch, so system support for it should be greater. Moving on, by building Thunderbolt 3 on top of USB Type-C, Intel is also inheriting Type-C's power delivery capabilities, which they will be making ample use of. With Type-C's Power Delivery 2.0 specification allowing for chargers that can supply up to 100W of power, it will be possible (though optional) to use Thunderbolt 3 to deliver that same power, allowing for uses such as having a Thunderbolt dock or display charge a laptop over the single Thunderbolt cable (the one thing Apple's Thunderbolt display can't do today with Thunderbolt 2). That said, the USB power delivery standard is distinct from Thunderbolt's bus power standard, so this doesn't necessarily mean that all Thunderbolt hosts can provide 100W of power, or even any USB charging power for that matter.
For standard bus-powered Thunderbolt devices, the Thunderbolt connection will now carry 15W of power, up from 10W for Thunderbolt 2. Finally, with the change in cabling, Intel is also clarifying how Thunderbolt backwards compatibility will work. Thunderbolt 3 to Thunderbolt adapters will be developed, which in turn will allow Thunderbolt 1/2 and Thunderbolt 3 hosts and devices to interoperate, so that older devices can work on newer hosts, and newer devices can work on older hosts. We're not clear at this time whether the adapter provides a simple bridge between the cable types (with the necessary regeneration), or whether there will be a complete Alpine Ridge controller in the adapter. Wrapping things up, Intel tells us that they expect to see Thunderbolt 3 products begin shipping by the end of the year, with a larger volume of products in 2016. Given this timing we're almost certain to see Thunderbolt 3 shipping alongside Skylake products, though Intel is making it clear that at a technical level Skylake and Thunderbolt 3 are not interconnected, and that it would be possible to pair Alpine Ridge Thunderbolt 3 controllers with other devices, be it Broadwell, Haswell-E, or other products. As for whether Intel will see more success with Thunderbolt 3 than with the previous versions of Thunderbolt, that remains to be seen. The switch to a Type-C port definitely makes it a bit easier for OEMs to stomach – DisplayPort on laptops has been fairly rare outside of Apple – so now OEMs can integrate Thunderbolt without having to install a port they don't see much value in. On the other hand, this is still an external controller of additional cost, which incurs power, space, and cooling considerations, all of which add to the cost of a desktop/laptop as opposed to pure USB 3.1.
As was the case with Thunderbolt 1 and 2, Intel’s greatest argument in favor of the technology is docking, as the use of PCI-Express and now the addition of USB Power Delivery gives Thunderbolt a degree of flexibility and performance that USB Type-C alone doesn’t match.
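The bandwidth progression described across these generations comes down to lanes times per-lane signaling rate. A small sketch of the arithmetic, using the per-direction lane counts and rates as the article describes them:

```python
# Thunderbolt bandwidth per generation: lanes per direction x per-lane rate.
generations = {
    "Thunderbolt 1": (1, 10),  # one 10 Gbps full-duplex channel (of two)
    "Thunderbolt 2": (2, 10),  # two channels bonded into one 20 Gbps channel
    "Thunderbolt 3": (2, 20),  # per-lane rate doubled, giving 40 Gbps
}

for gen, (lanes, gbps_per_lane) in generations.items():
    print(f"{gen}: {lanes * gbps_per_lane} Gbps full duplex")
```

The same arithmetic explains the passive-cable fallback: a standard Type-C cable carries the lanes at 10 Gbps each, so Thunderbolt 3 over passive cabling lands back at 20 Gbps.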
  13. The Chinese website benchlife.info released the schedule for Intel's new Skylake CPUs. According to the schedule, the first desktop versions will be available in August this year. The CPUs require an LGA1151 socket. Edit: I wrote that the CPUs will require DDR4 RAM, which doesn't seem to be true.
  14. The new Skylake CPU will be released in August or October 2015 with the new 100 series chipset.

    • 14nm manufacturing process
    • New "K" unlocked quad core CPU
    • DDR4 support
  15. Did not see this posted on the site; if it has been, then please delete this thread. Nice little comparison, enjoy! http://2p.com/9647554_1/Star-Citizen-GPU-Test-A-GTX750-Could-Give-You-30FPS-At-Max-Setting-1080p-by-Blake-Lau.htm http://gamegpu.ru/mmorpg-/-onlayn-igry/star-citizen-test-gpu.html Star Citizen GPU Test: A GTX750 Could Give You 30FPS At Max Settings, 1080p. By Blake Lau on Oct 21, 2014. Russian hardware site GameGPU recently tested what kind of GPU you may need to run Star Citizen at max settings at 1080p and above. The sandbox space sim is powered by CryEngine 3, and you need a gaming PC if you are looking for the best experience. GameGPU used a PC with an Intel Core i7 3970X @ 4.9GHz, 16GB DDR3 RAM, and a 120GB SSD for the test, running at 1920x1080, 2560x1600, and 3840x2160, all on "very high" graphics. The hardware requirements aren't scary; for the GPU, a GTX750 could give you an average 30 fps at 1080p. But since the game features space dogfights and planetside FPS gameplay, some of you may consider 60fps the best experience, and in that case you will need a GTX 780 (3GB). They also ran a CPU test and concluded that the CPU won't affect game performance much when you have a graphics card like a GeForce GTX 780 Ti installed. Please note that the game is still in development and the current build doesn't represent final quality. Via: GameGPU
  16. So... I am planning to build a new computer and have heard good things about both of these CPUs. Which do you guys think would be better for Star Citizen, and why? I have heard that the Intel CPU is higher-quality and runs faster, but the AMD CPU has 8 cores, which could be useful (since SC will be able to use 8 cores). What are the advantages and disadvantages that you see (other than the price tag)? Here are links to both: i7-4770K - http://pcpartpicker.com/part/intel-cpu-bx80646i74770k FX-8350 - http://pcpartpicker.com/part/amd-cpu-fd8350frhkbox
  17. Hello there people. I'm starting a new intelligence concern in our favorite space sim and I'd like to help all of you with any intelligence gathering needs you might have! See you in the verse!
  18. Star Swarm is a real-time demo of Oxide Games' Nitrous engine, which pits two AI-controlled fleets against each other in a furious space battle. Originally conceived as an internal stress test, Oxide decided to release Star Swarm so that the public can share our vision of what we think the future of gaming can be. The simulation in Star Swarm shows off Nitrous' ability to have thousands of individual units onscreen at once, each running their own physics, AI, pathfinding, and threat assessments. Note: Star Swarm is not a deterministic simulation -- the AI and everything else is being computed in real time, so you will get slightly different results from multiple executions even on the same hardware. Unfortunately, achieving 100% determinism with the highly threaded nature of the Nitrous engine is an unrealistic goal. http://store.steampowered.com/app/267130/ So, let's start. I believe this will be a useful tool to stress our components and compare our system performance. Star Swarm supports Mantle, so it would be an interesting way to compare AMD's API to DirectX. I have a mediocre gaming rig that's a couple of years old: Intel 2500K @ 4.6GHz, 8GB Corsair XMP @ 1600MHz, Asus 560 Ti @ 900MHz. Star Swarm settings: Scenario: Follow; Settings: Extreme; Stress Test: Timed Run (6 min); Start D3D. Average FPS: 26.99
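Because the benchmark is non-deterministic, a single run's average FPS is noisy; averaging several timed runs gives a steadier figure. A quick sketch (only the 26.99 result comes from the post above, the other run values are hypothetical):

```python
# Combine several Star Swarm timed runs into a mean and a spread.
# Only 26.99 is from the post; the rest are hypothetical repeat runs.
runs_fps = [26.99, 25.40, 27.80, 26.10]

mean_fps = sum(runs_fps) / len(runs_fps)
spread = max(runs_fps) - min(runs_fps)
print(f"mean {mean_fps:.2f} fps, spread {spread:.2f} fps")
```

Reporting the spread alongside the mean makes it easier to tell a real tuning improvement apart from run-to-run variance.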