Showing results for tags 'tech'.
Found 9 results

  1. Cool vid that uses a SC image 3.33m into the video (then soon after, Ex Machina)
  2. Heat = Energy. Use it to your advantage.
  3. CIG has decided to drop DirectX 12 and go with the Vulkan API, specifically because using DirectX 12 would restrict their user base to Windows 10 only https://www.neowin.net/news/star-citizen-plans-to-drop-directx-and-exclusively-use-vulkan-graphics-api "Cloud Imperium Games' upcoming space simulation game Star Citizen is en route to discontinue support for DirectX 11 and switch to the Vulkan API, while also abandoning previous plans of supporting DirectX 12. The Director of Graphics Engineering at Cloud Imperium Games, Ali Brown, revealed this piece of information on the official Star Citizen forums, going on to say that the Vulkan API's support for Windows 7, 8 and 10, as well as Linux, were the main factors in making this decision, as using DirectX 12 would restrict their user base to only Windows 10 users. For those unaware, DirectX 12 and Vulkan are low-level graphics APIs that, if used correctly, can utilize the GPU much more effectively than previous iterations of DirectX and OpenGL. This leads to better frame rates and lower CPU usage in games. However, Brown doesn't dismiss the possibility of DirectX 12 support for Star Citizen in the future. As he puts it, "DX12 would only be considered if we found it gave us a specific and substantial advantage over Vulkan". He also mentioned that the two APIs "really aren't that different". Considering the similarities between the graphics APIs and the ongoing development of Star Citizen, Cloud Imperium Games could easily change their stance on the exclusivity to Vulkan. Although, as Vulkan already supports Windows 10, focusing development time on a single pipeline could be the optimal path to take, since the developers are probably hoping to avoid further delays to Star Citizen.
In fact, we probably won't have to wait much longer to find out if the developers will have a change of heart on this decision, as Star Citizen's single player campaign component, Squadron 42, is aiming for a release this year, at least according to the official website, barring any hold-ups that have plagued the game since its inception." Main Source: https://forums.robertsspaceindustries.com/discussion/comment/7581676/#Comment_7581676
  4. I found this very interesting and would like some thoughts on this. As someone who games on a laptop, I might have to try this out for SC and other games.
  5. Impressed with Corsair this year, a lot better than 2015 PAX
  6. WHAT IS MU-MIMO?

    Does the term MU-MIMO ring any bells? If it doesn't, it soon will. Ready or not, this new Wi-Fi standard update is set to revolutionize the wireless networking world. Let's take a look at the technology and what makes it so cool for the average consumer.

    Sharing isn't easy

    To get a proper understanding of how MU-MIMO works, let's take a look at the way a traditional wireless router handles data packets. Routers – probably including the one you use – are very good at sending and receiving data, but only with one device at a time. If your router is streaming video to your computer, it cannot also be streaming online gameplay to a console.

    "But wait," you're saying, "I can't count the number of times I've played video games while my SO watches Netflix on the laptop, so this is clearly wrong." Yes, you can run multiple devices on your Wi-Fi network – otherwise, what would be the point? But older routers accomplish this by acting like a machine gun mounted on a merry-go-round: they rattle off bits of data very, very quickly to multiple devices in turn. Much like how a strip of film frames appears as one continuously moving image instead of a series of stills, accessing Wi-Fi feels like a constant connection to the Internet because the router switches between devices so quickly. However, it can only pay attention to one device at a time, which is one reason why Internet quality goes down when Wi-Fi bandwidth runs low.

    This works. It works so well that we rarely think about it. But the people who do think about it have long believed that it could work better. What if you could have a router that transmits data to multiple devices simultaneously? Wouldn't it increase efficiency, be faster, and allow for more interesting network configurations? So advances like MU-MIMO were developed and eventually incorporated into today's wireless standards. These developments allow advanced wireless routers to communicate with several devices at once.

    A brief history: SU vs. MU

    Let's talk about SU-MIMO vs. MU-MIMO. They sound like Pokémon, but are actually different ways to get routers to talk to multiple devices. SU-MIMO (single user, multiple input, multiple output) is older. The SU standard allowed routers to send and receive multiple streams of data at the same time, based on the number of antennas they had; each antenna could handle a different device. Of course this required new lines of routers with multiple antennas, but manufacturers had been looking for an excuse to stick some more knobs and points on their routers anyway. SU was included in the 802.11n standard update of 2007, and was slowly introduced in new product lines.

    However, SU-MIMO did have constraints in addition to its antenna requirements. While multiple devices could be connected, they were still dealing with a wireless router that could only pay attention to one device at a time – our machine gun and merry-go-round analogy. Data speeds went up and interference became less of a problem, but there was still a lot of room for improvement.

    MU-MIMO (multiuser, multiple input, multiple output) is the standard that evolved out of work with SU-MIMO and SDMA (Space Division Multiple Access). Before you zone out on all these acronyms, we'll cut to the chase and say that a lot of smart techies in universities and research labs created a way for routers to hold multiple conversations at the same time. With MU, the wireless router base station is able to communicate with multiple devices using a separate stream for each, as though each device has its own personal router. Read: TP-Link shows off new Archer C2600 router with MU-MIMO support. Eventually MU became feasible enough to be added to an 802.11ac update in 2013. After a few years of product design, manufacturers started incorporating the feature into their lines.

    Benefits of MU-MIMO

    MU-MIMO is an exciting development because it has a noticeable impact on everyday Wi-Fi use without directly changing bandwidth or other key factors. It just makes networks much more efficient. When using this new standard, you don't need multiple antennas porcupining out of your router, and you get a more stable Wi-Fi connection for your laptop, phone, tablet, or computer. Every device can be a spoiled child and avoid sharing its data stream with anyone else. This is particularly noticeable when streaming video or performing other demanding tasks. Your Internet speed feels faster and more dependable, although it's really just smart networking at work. You may also be able to use more devices on your Wi-Fi at once, which is handy when friends come over to visit.

    MU-MIMO may be great, but it also has a couple of limitations worth mentioning. Current standards support four devices; add more than that and devices will have to share a stream, which brings us back toward SU-MIMO problems. It is primarily used in downstream communication, and is limited when it comes to upstream (at least for now). Also, MU has to juggle more information about your devices and channel states than previous standards did, which makes managing and troubleshooting MU networks more complicated.

    How to get MU-MIMO

    Buy a new router and computer. Okay, it may be a little more complex than that, but you get the gist. This year manufacturers started showing off their MU-MIMO prototypes at trade shows and getting ready to announce product release dates. By 2017, you can expect it to be a common feature. Read: Netgear is releasing the Nighthawk X8, a Wi-Fi router supporting up to 5.3Gbps. When buying a wireless router, look for "MU-MIMO" in the product description or specs. As with other new features, marketers like using it as a selling point, so it should be pretty easy to tell from the blurbs whether your router can handle MU networking. One of the few examples on the market in 2015 is the Linksys EA8500 Max-Stream, if you want to buy right away. Read: Linksys starts taking preorders for its lightning fast EA8500 router.

    Of course, if you wait, MU-MIMO will eventually come to you. This is a smart move, because most of today's wireless devices aren't made for MU-MIMO – they force routers to revert to SU-MIMO. Until all your devices support MU-MIMO, it won't do much good. But that day is coming.
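    The machine-gun-on-a-merry-go-round analogy above can be made concrete with a toy scheduling model. Everything here is illustrative: the per-stream rate, the four-stream cap, and the least-loaded assignment are simplifying assumptions for the sketch, not how real Wi-Fi airtime scheduling works.

    ```python
    # Toy model: SU-MIMO serves one device at a time (time-slicing),
    # while MU-MIMO serves up to four devices on simultaneous streams.
    # STREAM_RATE_MBPS is a made-up per-stream throughput.

    STREAM_RATE_MBPS = 100   # hypothetical per-stream rate
    MU_MAX_STREAMS = 4       # current MU-MIMO standards support four devices

    def su_mimo_finish_time(demands_mb):
        """One device at a time: total time is the sum of every transfer."""
        return sum(mb * 8 / STREAM_RATE_MBPS for mb in demands_mb)

    def mu_mimo_finish_time(demands_mb):
        """Up to four parallel streams; extra devices must share a stream,
        approximated here by assigning each transfer to the least-loaded
        stream. The network finishes when the busiest stream drains."""
        streams = [0.0] * MU_MAX_STREAMS
        for mb in sorted(demands_mb, reverse=True):
            streams[streams.index(min(streams))] += mb * 8 / STREAM_RATE_MBPS
        return max(streams)

    demands = [50, 50, 50, 50]  # four devices downloading 50 MB each
    print(su_mimo_finish_time(demands))   # 16.0 s, serialized
    print(mu_mimo_finish_time(demands))   # 4.0 s, fully parallel
    ```

    With four equal downloads, the serialized SU schedule takes four times as long as MU's four parallel streams; push the device count past four and the MU model degrades back toward stream sharing, exactly the limitation described above.
    
    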
  7. 42" of goodness. IPS panel. 5ms response time. 3840x2160 at 60Hz (default, can overclock) http://www.amazon.com/WASABI-MANGO-UHD420-Real-HDMI/dp/B00YA5IZS0
  8. Donut

    Intel Xeon or i7

    I've recently been doing some research, and it looks like Xeon processors are higher-quality, "binned" versions of the same silicon that an i7 is built on. If that's the case, and people mainly use them for servers, why aren't we buying them up? I'm debating getting a Xeon 1620 or a 1650 over a 5820K i7. My main reason is that I leave my computer on 24/7; it basically never turns off unless I'm doing updates or cleaning it. I also feel like the higher quality will make it a longer-lasting CPU, since it seems people don't normally upgrade their CPU/MB/RAM for 5-6 years. I also feel that quad-channel memory is the way to go for 64-bit gaming, which is where SC and other games are heading these days. Thoughts? Suggestions? The main thing is that I want a decently high overclock and to never have to turn my PC off, which is why I'm considering it. As far as gaming goes, it seems to be a higher-quality i7, so there should be no performance lost, maybe even some gained. (And yes, I know people will ask why I need more than 4 cores. In my experience, my AMD 1100T performed better than the i7 970 at the time, which still only had 4 cores. That's the reason behind the 2011-3 socket I'm going with.) 5820k: http://pcpartpicker.com/part/intel-cpu-bx80648i75820k Xeon 1620: http://pcpartpicker.com/part/intel-cpu-bx80644e51620v3 Xeon 1650: http://pcpartpicker.com/part/intel-cpu-bx80644e51650v3
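    On the quad-channel point, the theoretical appeal is easy to put in numbers. A quick sketch, assuming DDR4-2133 (the stock memory speed on these socket 2011-3 chips) and a standard 64-bit memory channel; whether a game actually saturates dual-channel bandwidth in the first place is a separate question.

    ```python
    # Back-of-the-envelope peak memory bandwidth.
    # Assumptions (not from the post): DDR4-2133 and 8 bytes
    # (64 bits) moved per channel per transfer.

    def peak_bandwidth_gbs(channels, megatransfers_per_sec):
        bytes_per_transfer = 8  # 64-bit channel width
        return channels * bytes_per_transfer * megatransfers_per_sec / 1000.0

    dual = peak_bandwidth_gbs(2, 2133)   # mainstream dual-channel board
    quad = peak_bandwidth_gbs(4, 2133)   # X99 / socket 2011-3 quad-channel
    print(f"dual-channel DDR4-2133: {dual:.1f} GB/s")  # 34.1 GB/s
    print(f"quad-channel DDR4-2133: {quad:.1f} GB/s")  # 68.3 GB/s
    ```

    Doubling the channels doubles peak bandwidth on paper, but games are rarely memory-bandwidth-bound, so in practice the bigger draw of the 2011-3 platform is the extra cores and cache rather than the quad-channel controller.
    
    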