
Tuesday, January 8, 2013

AnandTech Article Channel


Hands On with the Huawei Ascend W1, Ascend D2, and Ascend Mate

Posted: 08 Jan 2013 12:07 AM PST

There hasn't been much in the way of mobile handset news out of CES 2013 this year, save some announcements by Huawei this morning. I didn't get a chance to make it into the Huawei press event, but instead caught up with Huawei at Pepcom this evening and got my hands on the three devices that were announced. 

Huawei Ascend W1

Probably the most intriguing handset from Huawei is their first Windows Phone 8 device, the Ascend W1 (H883G). Huawei is entering the market with a decidedly entry-level set of specs with the W1, but it's an impressive first effort. The W1 is based around a 1.2 GHz Snapdragon S4 Plus (MSM8230) SoC and 512 MB of LPDDR2, and looking at the menu, resolution on the 4-inch IPS display is WVGA. There's a 0.3 MP front facing camera and 5 MP rear facing camera as well. The W1 comes with 4 GB of internal storage, but behind the removable battery cover is a microSD card slot. Those specifications align pretty closely with other midrange or entry level WP8 devices, like the HTC 8S, and there's a definite place for them in the grand scheme of things.

The hardware is a bit thicker than I'd like for a WP8 device, but the tradeoff gets you a removable battery, microSD expansion, and (given it's Huawei) probably one of the most affordable price points around. There's no word on what that eventual pricing will be or whether any operators have already expressed interest, but I'd say Huawei making inroads with its entry-level WP8 device is inevitable.

Huawei Ascend Mate

Next up are Huawei's two new Android phones, both based around (Huawei) Hisilicon's own K3V2 SoC. I've been trying to get an Ascend D1 in so we can take a closer look at Huawei's K3V2, which consists of four ARM Cortex A9 CPUs clocked at up to 1.5 GHz for a single core (1.2 GHz with all four active), a 16-core Vivante GPU, and a 2x32 LPDDR2 memory interface, all built on a 40nm process. I've been interested in the K3V2 for some time, as Huawei joins a small number of handset makers serious enough about being vertically integrated to make its own SoC and (in the case of the D2) also its own baseband.


The Ascend Mate positively dwarfs the 5-inch HTC Droid DNA

Anyhow, the real headline feature of the Ascend Mate is its gargantuan 6.1-inch IPS display, which curiously enough is only 720p. The Mate is a seriously beefy phone, and although it fit in my hand, I think manipulation really does require two hands in this case.

The Mate has an 8 MP rear facing camera, 1 MP front facing camera, 802.11a/b/g/n, BT 4.0+LE, and runs Android 4.1. Just like a bunch of prior Huawei phones, the Mate is also pentaband WCDMA (850/900/1900/2100/AWS) courtesy of Intel's incredibly ubiquitous XMM6260 baseband and RF combo. Battery is a respectable 4050 mAh (I'm guessing at 3.7V, so just shy of 15 watt-hours).
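As a quick sanity check on that estimate, converting a battery's milliamp-hour rating to watt-hours is just a multiplication by the nominal cell voltage (the 3.7 V figure is my assumption, as noted above). A minimal sketch in Python:

# Convert battery capacity in mAh to energy in Wh at a nominal cell voltage.
# The 3.7 V default is an assumption; Huawei hasn't stated the Mate's cell voltage.
def mah_to_wh(capacity_mah, nominal_voltage=3.7):
    return capacity_mah / 1000 * nominal_voltage

print(mah_to_wh(4050))  # ~14.99 Wh, i.e. just shy of 15 watt-hours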

Huawei Ascend D2

The phone from Huawei Device I'm probably most interested in is actually the Ascend D2, which is a much more modestly sized device with a 5-inch display, running Android 4.1. The D2 departs from the Mate by packing a much nicer spec list on paper. That 5-inch display is 1080p, the rear facing camera is 13 MP, the front facing camera is 1.3 MP, there's 2 GB of LPDDR2, 32 GB of internal storage, and the device is IPX5/4 dust and water resistant respectively.

What's most interesting to me however is that this is Huawei Device's first smartphone with both their own SoC (again the K3V2 A9MP4 + Vivante GPU) and, more interestingly, a Balong V7R1 baseband for cellular connectivity. I've heard of the Balong 710, which is Huawei's LTE Category 4 baseband, but am not sure whether V7R1 refers to this part or something else which only tops out at WCDMA. Huawei clearly feels strongly about its modem, touting its DRX (Discontinuous Receive) and QPC (Quick Power Control) features quite vocally. Either way this is a very interesting step for Hisilicon and Huawei to take, and I very much want to play around with the Ascend D2 at some point.

Industrial design on the D2 clearly borrows from some other devices, but construction felt very impressive and solid. It has been clear for some time that Huawei can craft devices of their own at very competitive price points without sacrificing design. 



AMD Releases Full Product Specifications For Radeon HD 8000M Series

Posted: 07 Jan 2013 08:15 PM PST

Along with the announcement of their annual desktop GPU rebadge, AMD has also released the full product specifications for their new Radeon HD 8000M mobile GPUs. These GPUs were first announced back in December, but at the time AMD only gave us vague descriptions of each series with little-to-no information on the individual SKUs. Now with CES kicking into gear, we have the individual SKU information on hand.

AMD Radeon 8800M Series GPU Specification Comparison
  AMD Radeon HD 8870M AMD Radeon HD 8850M AMD Radeon HD 8830M
Was Variant Of 7800M Variant Of 7800M Variant Of 7800M
Stream Processors 640 640 640
Texture Units 40 40 40
ROPs 16 16 16
Core Clock 725MHz 575-725MHz 575MHz
Boost Clock 775MHz 625-775MHz 625MHz
Memory Clock 4.5GHz GDDR5 / 2GHz DDR3 4.5GHz GDDR5 / 2GHz DDR3 2GHz DDR3
Memory Bus Width 128-bit 128-bit 128-bit
VRAM 2GB 2GB 2GB
FP64 1/16 1/16 1/16
Transistor Count 1.5B 1.5B 1.5B
GPU Heathrow Heathrow Heathrow
Manufacturing Process TSMC 28nm TSMC 28nm TSMC 28nm
Architecture GCN GCN GCN

Starting at the top is the 8800M series, which is composed of a number of not-quite rebadges of 7800M parts. All of these GPUs are based on the Heathrow GPU (desktop name: Cape Verde), packing 640 Stream Processors along with 40 texture units and 16 ROPs. This is paired with 2GB of memory on a 128-bit bus, with AMD using both GDDR5 and DDR3 depending on the specific SKU.

Other than being minor variations on existing AMD GPUs, these products also enable AMD’s Boost Clock technology for the first time in a mobile part. Boost Clock allows the GPU to turbo up to a higher clockspeed so long as the GPU is under its power and temperature limits, allowing AMD to scrape every bit of thermal headroom from their GPUs. This technology has been available on certain desktop GPUs for more than half a year, but this is the first time we’ve seen it on a mobile part. Notably, NVIDIA’s mobile parts do not feature NVIDIA’s equivalent technology (GPU Boost) despite the fact that they introduced the technology on the desktop first, so AMD is ahead of NVIDIA in this regard.
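To illustrate the general idea (this is a simplified model for illustration, not AMD's actual Boost Clock/PowerTune implementation), a boost scheme can be thought of as a loop that nudges the clock toward the boost target whenever the GPU is under its power and temperature limits, and backs off otherwise. The limits and step size below are hypothetical; only the 725/775MHz clocks come from the 8870M's spec table above.

# Simplified, hypothetical boost-clock governor; not AMD's real algorithm.
BASE_CLOCK, BOOST_CLOCK = 725, 775    # MHz, from the 8870M spec table
POWER_LIMIT, TEMP_LIMIT = 45.0, 95.0  # W, degrees C -- made-up limits

def next_clock(current_clock, power_w, temp_c, step=25):
    # Raise the clock toward the boost target when under both limits,
    # otherwise fall back toward the base clock.
    if power_w < POWER_LIMIT and temp_c < TEMP_LIMIT:
        return min(current_clock + step, BOOST_CLOCK)
    return max(current_clock - step, BASE_CLOCK)

print(next_clock(725, power_w=30.0, temp_c=70.0))  # 750: power/thermal headroom available
print(next_clock(775, power_w=50.0, temp_c=70.0))  # 750: over the power limit, back off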

AMD Radeon 8700M Series GPU Specification Comparison
  AMD Radeon HD 8790M AMD Radeon HD 8770M AMD Radeon HD 8750M AMD Radeon HD 8730M
Was New New New New
Stream Processors 384 384 384 384
Texture Units 24 24 24 24
ROPs 8 8 8 8
Core Clock 850MHz 775MHz 620-775MHz 650MHz
Boost Clock 900MHz 825MHz 670-825MHz 700MHz
Memory Clock 4.5GHz GDDR5 4.5GHz GDDR5 4GHz GDDR5 / 2GHz DDR3 2GHz DDR3
Memory Bus Width 128-bit 128-bit 128-bit 128-bit
VRAM 2GB 2GB 2GB 2GB
FP64 1/16 1/16 1/16 1/16
Transistor Count ? ? ? ?
GPU Mars Mars Mars Mars
Manufacturing Process TSMC 28nm TSMC 28nm TSMC 28nm TSMC 28nm
Architecture GCN GCN GCN GCN

Below the 8800M series is the 8700M series, marking the introduction of AMD’s new Mars GPU. This is the 4th (and presumably final) first-generation GCN GPU, packing 384 stream processors, 24 texture units, and 8 ROPs. The 8700M is connected via a 128-bit memory bus to 2GB of either GDDR5 or DDR3 depending on the SKU, and like the 8800M series it also features AMD’s Boost Clock. Interestingly, all Mars products only offer a PCIe x8 bus instead of the industry-standard x16. The performance difference on these lower-performance parts should be minuscule, but it’s an example of one of the ways AMD was able to achieve their smaller die size, by reducing the number of pins that needed to connect to the GPU.

AMD Radeon 8600M/8500M Series GPU Specification Comparison
  AMD Radeon HD 8690M AMD Radeon HD 8670M AMD Radeon HD 8590M AMD Radeon HD 8570M
Was New New New New
Stream Processors 384 384 384 384
Texture Units 24 24 24 24
ROPs 8 8 8 8
Core Clock 775MHz 775MHz 620MHz 650MHz
Boost Clock 825MHz 825MHz 700MHz 700MHz
Memory Clock 4.5GHz GDDR5 2GHz DDR3 4.5GHz GDDR5 2GHz DDR3
Memory Bus Width 64-bit 64-bit 64-bit 64-bit
VRAM 1GB 1GB 1GB 1GB
FP64 1/16 1/16 1/16 1/16
Transistor Count ? ? ? ?
GPU Mars Mars Mars Mars
Manufacturing Process TSMC 28nm TSMC 28nm TSMC 28nm TSMC 28nm
Architecture GCN GCN GCN GCN

Finally we have the 8600M and 8500M series. These are essentially cut-down Mars parts, halving the memory busses from 128 bits wide to 64 bits wide, which will significantly cut into performance in the process. The feature set and functional unit counts are otherwise identical to the 8700M series. The difference between the 8600M and 8500M series appears to come down solely to clockspeeds; the 8600M parts are clocked up to 775MHz, while the 8500M parts are clocked up to 620MHz/650MHz.
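To put the narrower bus in perspective, peak memory bandwidth is just the effective data rate multiplied by the bus width in bytes. A quick comparison using the clocks from the tables above:

# Peak memory bandwidth in GB/s from effective data rate (GT/s) and bus width (bits).
def peak_bandwidth_gbs(data_rate_gts, bus_width_bits):
    return data_rate_gts * bus_width_bits / 8

print(peak_bandwidth_gbs(4.5, 128))  # 8790M (GDDR5, 128-bit): 72 GB/s
print(peak_bandwidth_gbs(4.5, 64))   # 8690M (GDDR5, 64-bit):  36 GB/s -- half the bandwidth
print(peak_bandwidth_gbs(2.0, 64))   # 8570M (DDR3, 64-bit):   16 GB/s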



AMD’s Annual GPU Rebadge: Radeon HD 8000 Series for OEMs

Posted: 07 Jan 2013 07:20 PM PST

In an effort to mimic the model year nature of cars and other durable goods, in recent years PC OEMs have increasingly moved to updating their wares both on a technical basis and on a calendar basis. Of course with the technical cycles being 15-18 months as opposed to a 12 month calendar cycle, this means that OEMs are often put into a position where they’re doing their yearly refresh in the middle of a technical cycle, and 2013 is no different. This of course gives rise to the annual rebadge cycle that we have become familiar with over the years.

We’ll see a number of “new” desktops and laptops at CES this year. But along with rebadging the systems themselves, the pressure to rebadge has been pushed down to the component suppliers, which means that powering these “new” systems we’ll see a number of “new” components. In the GPU world both AMD and NVIDIA make an annual event of this, which for market reasons are roughly timed to coincide with CES.

Kicking the GPU rebadge cycle off this year is AMD, who along with their press conference today also pushed out their rebadges. Let’s jump right into the thick of things.

Desktop

Because the rebadge cycle is OEM driven, rebadging is typically focused exclusively on OEM parts, and this year is no exception. The Radeon HD 7000 series isn’t going anywhere in the retail market, but in the OEM market where OEMs are demanding parts with higher numbers, the entire Radeon family from top to bottom is getting rebadged. This means everything from the powerhouse Radeon HD 7970 GHz Edition to the diminutive (and ancient) Radeon HD 5450 are getting 8000 series product designations. AMD to their credit has kept their retail desktop lineup consistent in naming and features, but with the OEM lineup this has gone completely out the window.

AMD Desktop Radeon 8000 OEM Series GPU Specification Comparison
  AMD Radeon HD 8970 OEM AMD Radeon HD 8950 OEM AMD Radeon HD 8870 OEM AMD Radeon HD 8760 OEM AMD Radeon HD 8740 OEM
Was 7970 GHz Edition 7950 W/Boost 7870 7770 7750-900
Stream Processors 2048 1792 1280 640 512
Texture Units 128 112 80 40 32
ROPs 32 32 32 16 16
Core Clock 1000MHz 850MHz 1000MHz 1000MHz 900MHz
Boost Clock 1050MHz 925MHz N/A N/A N/A
Memory Clock 6GHz GDDR5 5GHz GDDR5 4.8GHz GDDR5 4.5GHz GDDR5 4.5GHz GDDR5
Memory Bus Width 384-bit 384-bit 256-bit 128-bit 128-bit
VRAM 3GB 3GB 2GB 1GB 1GB
FP64 1/4 1/4 1/16 1/16 1/16
Transistor Count 4.31B 4.31B 2.8B 1.5B 1.5B
Board Power <250W <200W <175W <110W <75W
GPU Tahiti Tahiti Pitcairn Cape Verde Cape Verde
Manufacturing Process TSMC 28nm TSMC 28nm TSMC 28nm TSMC 28nm TSMC 28nm
Architecture GCN GCN GCN GCN GCN

AMD Desktop Radeon 8000 OEM Series GPU Specification Comparison
  AMD Radeon HD 8670 OEM AMD Radeon HD 8570 OEM AMD Radeon HD 8400 OEM AMD Radeon HD 8350 OEM
Was New New 6450 5450
Stream Processors 384 384 160 80
Texture Units 24 24 8 8
ROPs 8 8 4 4
Core Clock 1000MHz 730MHz 625-875MHz 650MHz
Boost Clock ? ? N/A N/A
Memory Clock 4.6GHz GDDR5 4.6GHz GDDR5 / 1.8GHz DDR3 3.6GHz GDDR5 / 1.8GHz DDR3 <=1.8GHz DDR3
Memory Bus Width 128-bit 128-bit 64-bit 64-bit
VRAM 2GB 2GB 512MB/1GB 512MB/1GB
FP64 1/16 1/16 N/A N/A
Transistor Count ? ? 370M 292M
Board Power <75W <50W <35W <25W
GPU Oland Oland Caicos Cedar
Manufacturing Process TSMC 28nm TSMC 28nm TSMC 40nm TSMC 40nm
Architecture GCN GCN VLIW5 VLIW5

The OEM 8900 series are rebadges of the 7970GE and 7950 w/Boost respectively. Meanwhile the sole 8800 part, the 8870, is a rebadge of the 7870. Further down the list the 8700 series is composed of a rebadged 7770 and 7750-900 (which never saw a proper launch outside of China).

Farther down the lineup still, we actually see a break from rebadging with the introduction of new desktop parts. AMD’s recently announced Oland GPU, the very last member of the first generation of the GCN family (and not part of AMD’s forthcoming refresh), will be joining AMD’s OEM desktop lineup as the 8670 and 8570. With only 384 SPs these budget GPUs are not particularly potent, and we wouldn’t at all be shocked if they never come to the retail desktop market. The real question right now is where they stack up against iGPU solutions such as Trinity’s HD7600 GPUs or Intel’s HD4000, or NVIDIA’s equally low-end desktop GK107 cards like the GT 640 and GTX 650.

Finally at the bottom of AMD’s OEM 8000 series stack are some of the oldest AMD GPUs still in production, and decidedly not GCN parts. The 8400 series is a rebadge of various configurations of the Radeon HD 6450 (Caicos GPU, VLIW5). Meanwhile the 8300 series is a rebadge of the ancient VLIW5 Cedar GPU, first introduced in 2010 as the Radeon HD 5450. Frankly it’s not at all clear at this point just what the purpose of these final rebadges is, as these cards are slower than a good iGPU. APAC markets are even more heavily weighted towards budget components than the North American market already is, so it’s quite likely that these cards are meant to fill APAC-specific product needs.



Qualcomm's Next-Gen Krait 400 & Krait 300 Announced in Snapdragon 800 & 600 SoCs

Posted: 07 Jan 2013 05:53 PM PST

We've been hinting at this for a while, both on the Podcast and in our most recent power analysis piece, but today it's very official: Qualcomm is announcing the next two versions of its Krait architecture.

Krait is the codename for Qualcomm's custom ARMv7 microprocessor. The 3-wide out-of-order design has dominated the smartphone landscape since its introduction last year. Unlike what we saw with the Scorpion-to-Krait transition, Qualcomm is going to keep Krait fresh with more frequent updates.
 
The first two updates come today: Krait 300 and Krait 400.
 

Krait 300

 
In usual Qualcomm fashion, we're missing good depth on exactly what these new revisions deliver. This is one area where Qualcomm really needs to emulate Intel: we know more about Haswell than we do about the original Krait.
 
That being said, here's what we do know. Krait 300 is still built on TSMC's 28nm LP process, just like the original Krait. The pipeline remains unchanged, but Qualcomm is able to squeeze higher clocks out of the core. It's unclear whether we're simply talking about voltage scaling or a combination of that and improvements to timing, yields and layout. Whereas the current Krait core tops out at around 1.5GHz, Krait 300 will run at up to 1.9GHz.
 
Another big addition to the architecture is that Krait 300 now features a hardware data prefetcher, which preemptively grabs data out of main memory and brings it into the L2 cache. The original Krait core had no L2 prefetchers.
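For those unfamiliar with the concept, a hardware prefetcher watches the stream of addresses a core touches and speculatively pulls the lines it expects to be needed next into the cache before they're requested. The sketch below is a textbook stride detector, purely for illustration; Qualcomm hasn't disclosed how Krait 300's prefetcher actually works.

# Toy stride prefetcher: if consecutive accesses are equally spaced, fetch ahead.
# Illustrative only; not a description of Krait 300's prefetch hardware.
class StridePrefetcher:
    def __init__(self):
        self.last_addr = None
        self.last_stride = None

    def access(self, addr):
        prefetch = None
        if self.last_addr is not None:
            stride = addr - self.last_addr
            if stride != 0 and stride == self.last_stride:
                prefetch = addr + stride  # pattern confirmed, prefetch the next address
            self.last_stride = stride
        self.last_addr = addr
        return prefetch

pf = StridePrefetcher()
for addr in (0x100, 0x140, 0x180, 0x1C0):
    hint = pf.access(addr)
    print(hex(addr), "->", hex(hint) if hint else "no prefetch")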
 
Single threaded IPC improvements are the name of the game with Krait 300 and like all good evolutions to microprocessor architectures, the new Krait improves branch prediction accuracy. Since there's no increase to pipeline depth, improved branch prediction directly results in improved IPC (and better power efficiency).
 
Both Qualcomm and ARM have been very vague about what types of instructions can be executed out of order, but Krait 300 can execute more instructions out of their original program order. Building a robust OoOE (Out of Order Execution Engine) is very important to driving higher performance, and being able to reorder more types of instructions directly impacts single threaded performance.
 
Krait 300 now supports forwarding between pipelines, although it's not clear whether or not the previous architecture lacked any ability to forward data between stages. 
 
Finally Krait 300 improves FP and JavaScript performance. Once again, it's not clear how. I've asked Qualcomm whether there have been any changes to the execution units in Krait 300 to enable these improvements. In general I believe we're looking at around a 15 percent increase in performance at the same clock frequency, for a jump of 20 to 30 percent overall with the clock increases. This isn't necessarily enough to close the gap between Krait 300 and ARM's Cortex A15; however, Krait 300's power profile should be much better. Compared to Atom, the Krait 300 improvements should be enough to at least equal performance if not surpass it, but not necessarily significantly.
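Since performance scales roughly as IPC multiplied by clock frequency, the per-clock and overall gains combine multiplicatively. A quick worked example; the baseline clocks here are my own assumptions about current shipping Krait parts, not Qualcomm figures:

# Performance ~ IPC x frequency. Baseline clocks below are assumptions for illustration.
def overall_speedup(ipc_gain, old_clock_ghz, new_clock_ghz):
    return (1 + ipc_gain) * (new_clock_ghz / old_clock_ghz)

print(overall_speedup(0.15, 1.7, 1.9))  # ~1.28 vs a 1.7GHz part -- in the quoted 20-30% band
print(overall_speedup(0.15, 1.5, 1.9))  # ~1.46 vs a 1.5GHz part, so the 20-30% figure
                                        # presumably assumes higher-clocked current parts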
 

Krait 400

 
If Krait 300 is the new performance mainstream successor to Krait, Krait 400 is the new ultra high end part. Using TSMC's new 28nm HPm process (High-K + Metal Gate, optimized for low power at peak performance), Krait 400 can run at up to 2.3GHz. The 400 series core inherits all of the improvements from Krait 300 but adds a couple more. 
 
The move to 28nm HPm necessitates a redesign of circuits and some relayout, but Qualcomm also improved the memory interface on the core. Krait 400 enjoys lower latency to main memory and a faster L2 cache.
 

The New Snapdragons

 
The new Kraits will find their way into new Snapdragon platforms, now numbered 200, 400, 600 and 800 (the old S1 - S4 labels are gone). As always, higher numbers mean better performance, but you'll still need to rely on the internal part numbers to know what's really inside.
 
Today Qualcomm announced the Snapdragon 800, which implements four Krait 400 cores running at up to 2.3GHz, an Adreno 330 GPU and Qualcomm's 3rd generation LTE baseband (9x25) all on a single die. Snapdragon 800 is the part formerly (or still internally) known as MSM8974 which we've seen rumblings about numerous times. 
 
Qualcomm tells us that the Adreno 330 will offer roughly 50% more graphics performance over Adreno 320, and an almost 2x increase in compute performance.
 
The integrated 9x25 3rd generation LTE baseband enables support for UE Category 4 LTE with up to 150Mbps downstream. This is the same IP block as in the MDM9x25, and likewise MSM8974/Snapdragon 800 will be available in all the usual variants (CDMA2000/WCDMA/LTE, WCDMA/LTE, and finally no modem).
 
Snapdragon 800 also integrates 802.11ac baseband, continuing the on-die WLAN integration we saw with the 8960 and the rest of the previous S4 family.
 
Snapdragon 800 also includes a 2x32 LPDDR3 memory interface.
 
On the video side, the SoC supports encode and decode of 4K content at 30 fps.
 
Also being announced today is the Snapdragon 600. This part integrates four Krait 300 cores running at up to 1.9GHz. Adreno 320 handles GPU duties, although with an increased clock speed. Compared to the current Snapdragon S4, the 600 is expected to improve performance by up to 40% if you combine IPC and frequency increases.
 
The new Snapdragon 600 is also known by the part number APQ8064T, and was formerly known as the Snapdragon S4 Pro.
 

Final Words

 
Qualcomm really is the one to beat when it comes to smartphone SoCs. Its excellent baseband integration combined with a very good balance of power and performance on the CPU/GPU side make for a platform that's difficult to outperform. 
 
With Krait 300/400, Qualcomm is really evolving its Krait architecture the right way. The update comes at the right time after the original Krait, and improves performance in the right way. A religious focus on improving single threaded performance, generation over generation, without blowing through your power budget is the only way to do this. Qualcomm gets it.
 
Krait was good, but Krait 300/400 are likely going to continue to carry that flag through 2013. More importantly, Qualcomm has hinted numerous times that it has a "pipeline of Kraits" lined up for the future. 
 
Tablets and larger devices are really where Qualcomm will have its work cut out for it. Between Intel's x86 offerings and ARM's Cortex A15, Qualcomm's strengths still apply - but they're going to face more strenuous competition.
 
Today's announcements are a welcome update. Qualcomm is gearing up for a war and is definitely making the right moves. If it can keep up this aggressive cadence, Krait can easily become a fixture in the ultra mobile space. 



AMD CES 2013 Press Event Live Blog

Posted: 07 Jan 2013 03:05 PM PST

We're live at the AMD CES 2013 press event! Keep your browser parked here for live updates!



Netgear Selects Broadcom HPAV Solution for Powerline Products

Posted: 07 Jan 2013 02:00 PM PST

HomePlug and G.hn are tussling it out to emerge as the de-facto powerline standard, but HomePlug has enjoyed a lot of success as the incumbent. In the North American market, Intellon was the pioneer. They were later acquired by Atheros, who were then acquired by Qualcomm. Almost all PLC (Powerline Communication) devices shipping directly to consumers so far have been based on Intellon / Atheros chipsets. One of the exceptions was Belkin's PWLAV500 solution based on Gigle Semiconductor's silicon. Gigle was later acquired by Broadcom.

Netgear has been one of the leading vendors of powerline communication products, and so far, they have all been based on Atheros silicon. Today, Broadcom announced that the latest Netgear powerline adapters XAV1201, XAV1301, XAV1401, and XAV1601 are all powered by the BCM60321.

It must be noted that the BCM60321 is a HPAV 1.1 solution and provides PHY rates of up to 200 Mbps. The BCM60321 is a monolithic PLC chipset with an integrated AFE (Analog Front End) manufactured in a 40nm process. This should lower the BOM cost and enable cheaper powerline adapters for consumers. Interoperability with all existing PLC adapters (based on Qualcomm Atheros's firmware version 3.1 or later) is guaranteed.



Hands on With NVIDIA's Project Shield - Update: Now With Video

Posted: 07 Jan 2013 01:11 PM PST

Anand and I just spent some time playing with NVIDIA's recently announced Project Shield, NVIDIA's portable handheld gaming device and home to the also just-announced Tegra 4 SoC (4x ARM Cortex A15). I came away pretty impressed with Project Shield's ergonomics after initially going in very skeptical (like GPU editor Ryan Smith) about how 38 watt-hours of batteries, a display, Tegra 4, and all the accompanying hardware could possibly feel comfortable stuffed into what boils down to a game controller.

First off, Shield isn't nearly as heavy as I thought it would feel in the hands. I expected Shield to feel very dense; instead the device doesn't feel much different from a wireless Xbox 360 controller with the rechargeable battery pack. I made a point of holding onto the Shield the entire time in a few different positions and didn't feel my hands or arms fatiguing at all. NVIDIA carefully positioned the batteries right around the palm areas, so the heaviest parts of the device are right in-axis with one's palms and arms rather than somewhere else that would torque one's grip. There's something about the construction that feels balanced and masks the fact that this is actually a sizeable piece of hardware. The final Shield will have a soft touch bottom and a different top texture finish; this first version we played with did not.

NVIDIA was eager to point out that the D-Pad, triggers, and bumpers are all in the process of being tweaked and tuned, and that the spring preloads and click points on the Shields we played with were nowhere near final. This is good to hear, as the D-Pad at present definitely needed to be less mushy and more communicative; we're told it will be replaced with a much more responsive one before Shield is finalized. I didn't have a problem with the analog sticks but would love to feel alignment nubs or texture on the domes.

My biggest thoughts were framed around the placement of the two analog sticks, which at present puts their tops in the plane of the buttons and D-Pad. This initially felt very alien until I realized it's done because the display needs to close shut (the analog sticks would otherwise protrude into the screen), but it still felt a little bit awkward. I'm used to the 360 controller personally, which has analog sticks that protrude above the plane of the buttons. I can see myself getting used to this, as the awkwardness is simply an artifact of my prior exposure and trained response to the 360 controller.

The 720p 5-inch "retinal" display indeed exceeded my visual acuity and was hard to pick out individual pixels on, though I still think there's too much bezel on Shield that should be taken up with more display. Something like a 5.5" 1080p display would make much more sense, but that's probably reserved for another iteration. The biggest concern is how smaller features in PC games played back on the Shield could be difficult to pick out. NVIDIA claims it will mitigate this by working with game developers on appropriate titles or offering a pinch to zoom feature to let users read small elements. Obviously some games with lots of text (they offered the example of EVE Online) can't possibly work perfectly, but I had no problems playing a few levels of Assassins Creed 3 and Black Ops 2 on Shield. I could see some H.264 artifacts while playing on Shield; higher bitrates could solve that problem easily.
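For reference, pixel density follows directly from the resolution and the diagonal size, which is why a 5-inch 720p panel lands near the limit of what most people can resolve at typical handheld distances:

import math

# Pixels per inch from resolution and diagonal size.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1280, 720, 5.0)))   # ~294 PPI for Shield's 5-inch 720p panel
print(round(ppi(1920, 1080, 5.5)))  # ~401 PPI for the hypothetical 5.5-inch 1080p panel mentioned above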

Wireless connectivity on the Shield is courtesy of Broadcom's latest BCM43241, a 2x2:2 802.11a/b/g/n solution, which is the right thing to do in a platform that so strongly leverages wireless display and control. Responsiveness while playing PC games on the Shield was extremely good; I couldn't detect any lag.

At the front of the Shield is a small gap and grille; it turns out the heatsink in the NVIDIA press event video was in fact real, as air is drawn in from the front of the Shield, over the heatsink, and exhausted out the back. There is indeed a fan inside, albeit a small one. NVIDIA says it won't come on all the time during normal use, and after playing a few Android games natively (Hawken) on the device this seemed to be the case. One Shield left in the sun did have its fan kick on, though it was essentially inaudible. I didn't actually feel Shield get hot at all during use, but that heatsink wasn't just for show in the press event video/demo, it's real.

At the very back are the microSD, microHDMI, and microUSB ports and the headphone jack. I'm told these are also changing slightly; I'd honestly like to see the headphone jack come around to the front.

What really struck me about the Shield was how far from ergonomic the device appears, and yet how surprisingly comfortable it is to hold. NVIDIA nailed the underside ergonomics almost perfectly: there's a small ledge for your fingers to rest on, and the palm cups are indeed reminiscent of the 360 controller.

Android 4.2.1 on the Shield felt extremely responsive and fluid. I am very impressed with browser scrolling in both Chrome and Browser.apk, the latter of which is now a huge optimization target for NVIDIA. The rest of the UI was also very fluid. I should note that NVIDIA is not allowing benchmarking at present, so we can't say anything but just subjective impressions about Tegra 4 performance.

We recorded a video of gameplay on the Shield and are uploading it as fast as CES connectivity will afford. Edit: Videos are now up. 



Netgear at CES 2013: Connected Entertainment Solutions (Google TV and More...)

Posted: 07 Jan 2013 11:00 AM PST

Netgear's press conference at CES 2013 is kicking off as I write this, and the focus this time around seems to be heavily in the connected entertainment solutions area. We have already seen Netgear's Google TV device pass by the FCC, and the device (NeoTV PRIME GTV 100) is officially launching today for $130. This is $30 more than Vizio's Google TV device, and we are very interested in checking out what the extra $30 nets consumers. The aVia media player app comes pre-installed, and it should be possible to install other market apps as well.

Existing NeoTV players (NTV300, NTV300S and NTV300SL) are also being updated with a SlingPlayer app, which enables users to enjoy live TV with placeshifting.

In an interesting move, Netgear has teamed up with Digital Keystone to offer service providers Neomedia, a universal media gateway technology which doesn't require middleware in the consumer premises equipment, yet manages to deliver broadcast and personal content to multiple screens. The important capabilities include in-home and cloud-based access to live and recorded linear TV services, secure playback on retail and thin IP clients, and aggregation of live TV, VoD and network DVR services. We will try to learn more about this platform when we meet up with Netgear later this week at CES.

Netgear purchased VueZone, a consumer IP camera vendor, last year. Today, a new product has been introduced under the VueZone brand, the VZCN2060. The add-on night vision camera consists of two parts, an AC-powered IR lamp along with a wire-free infrared camera, and will retail at $130. Home monitoring solutions using IP cameras are taking off in a big way, and Netgear's wire-free solution with VueZone cameras [ www.netgear.com/vuezone ] is quite innovative.

Netgear is also introducing some networking equipment at the show. These include the WN3500RP dual-band Wi-Fi range extender (wall plug edition), which also supports AirPlay / DLNA music streaming when connected to speakers; the D6200 WiFi DSL modem router with 867 Mbps 802.11ac, 300 Mbps 802.11n, and an integrated DSL modem; and a Music Extender kit (XAUB2511) for playing music over AirPlay from any computing device to any stereo system. Stay tuned for more information after we get some hands-on time later this week with what Netgear has to showcase at CES.



Hands On with ASUS' Transformer All-in-One (P1801)

Posted: 07 Jan 2013 10:47 AM PST

ASUS showed us a near production-ready sample of its Transformer All-in-One PC. The machine features a multitouch 18.4" 1080p IPS display driven by two independent systems. In the base, a Core i5/i7 machine running Windows 8, and in the display itself there's a Tegra 3 Android system. There's a physical button on the right side of the system that switches display inputs between the two systems. Since you're just changing display inputs the switching lag is similar to how long it takes to switch between systems on a multi-input monitor.

You can undock the 18.4" display, transforming it into a giant Tegra 3 Android tablet (running 4.1.1). When undocked you're also able to wirelessly stream (over Miracast or some alternative) the Windows 8 base's display over to it, making it a giant Windows 8 tablet if you prefer. Both the base and the display/tablet have their own storage obviously. 

The tablet experience is surprisingly reasonable, although cradling an 18.4" tablet can be an issue over the long haul. Thankfully ASUS equipped the display/tablet with an elegant kickstand. With the display detached, you can also use the Core i5/i7 base to drive a separate external display if you'd like.

The display has an integrated battery which ASUS claims will power the tablet for around 4 - 5 hours. 



LG bringing UltraHD, OLED and more Google TVs to your living room

Posted: 07 Jan 2013 10:24 AM PST

LG President and CEO Wayne Park kicked off their CES press conference by reminding us that the brand’s presence in North America was launched a decade ago at CES. The company itself had been a presence in consumer electronics for some time, but the LG brand was brought here just in time to catch the mounting wave of smartphones and other smart devices, and though we’ve seen them falter at times, they’ve consistently gained market share in some of the largest consumer electronics categories. In 2012, they had several important design wins, from the LG Google TV, to the Optimus G and Nexus 4. Worldwide they also saw a number of big wins in the mobile space with their phone/tablet mash-up the Optimus Vu and their L-series of mid-range smartphones. So, what do they have in store for us in 2013?

The answer is a bit guarded, sadly. No new LG Mobile announcements are in store for CES; instead, we were reminded of the success they’ve had with the aforementioned G and Vu, and their successful collaboration with Google to produce the Nexus 4. They emphasized that their approach to software design is aimed at improving user experience through multitasking and feature adds that target a user’s interests. We’ve always found their skins a bit of an acquired taste, and there is pretty much universal preference for stock Android amongst AT editors, but we’ll give them some credit for trying. 
 
 
The other Android news involved their continued collaboration with Google for their Google TV powered Smart TVs. When the Google TV program was announced, we expected it to function as a tightly controlled ecosystem, with hardware and software specifications strictly laid out and enforced, and partners differentiating with changes in form factor and control paradigms. Think Windows Phone and not … you know, Android. What we’ve seen, though, has been something unique, and we expect this to continue. No longer is Google TV a strictly defined thing; rather it is the implementation of search, recommendations and YouTube to build a television experience that seeks to break the old paradigm and provide a more “when you want it, how you want it” experience. And featured squarely in every Google TV device will be Google’s content portal, YouTube. The success of Google TV will be up for discussion for quite some time. But Google and LG are putting some new tools to use to further their efforts in 2013.

LG’s stable of motion control techniques is expanding to provide even more control paradigms. Last year at CES, LG previewed a small gaming device that used cameras to implement Wii-like motion control for casual gaming. The research they put into those efforts is being put to use in their 2013 Google TV line-up for hand-based gesture controls, first by allowing you to select a channel by tracing the digits with your finger. Despite the preponderance of video streaming options, the vast majority of television watching still happens through traditional broadcast and cable providers. So maybe there’s some dividend to be paid in providing a more “natural” way to switch between channels than simply punching buttons on a remote. This will be in addition to the use of the Magic Wand remote control for point-and-click actions. LG will also be refining the remote by bringing voice control into the remote itself. The impact on battery life could be considerable, since presumably an SoC will be housed in the device, with the results transmitted over Bluetooth to the set for action. The added benefit of this method is a move from command-driven voice control (“Watch ESPNU”) to a natural language method (“I’d like to watch some college football.”).
 
These are still early days in the evolution of the television, revolutionary talk notwithstanding. Whether any of the techniques LG is implementing today will have any profound impact on the market is something to assess in years, not today or tomorrow, but we’re always glad to see someone trying, rather than leaving a market stagnant.

LG's 84-inch UltraHD TV at CES

We don’t cover televisions in any appreciable way, in part because they are commodities in the very worst way, with a tendency to drive to the bottom of the market. Our readers, though, will recognize some of the key tech being introduced this year. This year LG is bringing two TV technologies to the market, and realistically, these may be the first two technologies to justify a new set in a long time. 

First, we have the long (long) awaited introduction of “mass market” OLED televisions. LG will bring its 55” OLED 1080p 3D Smart TV to the US this March, for a recommended price of $12,000. While that cost-prohibitive price will keep it off shelves at your local Walmart or Best Buy, it is distinctly in the realm of reason for A/V enthusiasts. If LG’s OLED fabrication techniques are ready for that market, then it is possible we could see similar sets landing on big box store shelves within a few years.
 
The other big reveal for CES 2013 is production sales of their 84” UltraHD sets, and two new sizes (55 and 65”) being introduced. Pricing wasn’t provided, but sales should start in March.

LG's latest ultra-widescreen display

Lastly, we’ll take a quick look at LG Display’s latest plans for computer displays. LG has proven that they are willing to push themselves when it comes to risky design and feature choices, and they’re planning on continuing that push in 2013. LG will expand their line-up of 21:9 ultra-widescreen displays, providing added screen real estate for professionals and gamers alike. They’ll also be expanding their 2560x1440 (WQHD) offerings, something that’s sure to curl the toes of Ian Cutress, our Senior Motherboard Editor. The technologies that are bringing HiDPI displays to our phones and tablets are sure to make an impact in the computer display space, and this could be the year that we usher in an era of similar computer displays while bidding adieu to the cheap 1080p panels that have plagued us for so long. Touch displays will also be on offer from LG, though some of the samples we saw at the brief demo were a bit glossier than I would consider ideal.
 
The show is just getting started; we’ll try to get some more hands-on time with all that LG has at the show shortly. For now, stay tuned.



Asus Announces ROG ARES II Video Card: Dual Radeon HD 7970GE On A Single Card

Posted: 07 Jan 2013 09:00 AM PST

Dual GPU video cards are nothing new, but there are very few companies that do it with the level of flair (and dare we say overkill) of Asus. Along with their standard reference-derived designs for products such as the GTX 690 and Radeon HD 6990, the company also produces one or two ultra-luxury custom designs that push the envelope in terms of performance, features, power, and size. These cards have been released under the ARES (AMD) and MARS (NVIDIA) brands, with the most recent entry being the dual-GTX580 based MARS II in 2011.

Now with CES upon us Asus has announced their next custom dual-GPU card, the Asus ARES II. Like the Radeon HD 5870 based ARES (I), the ARES II is effectively two Radeon video cards on a single board. Specifically, Asus is building a single-card Radeon HD 7970 GHz Edition CrossFire solution, packing two overclocked 7970GE GPUs and 6GB of memory (3GB per GPU) on to a single card. Clockspeeds are at 1050MHz for the core and 6.6GHz for the memory, 50MHz (5%) and 600MHz (10%) over a stock 7970GE respectively.

As to be expected, power and cooling requirements for such a card are quite high, which is where Asus’s customizations and expertise come into play. Feeding the beast will be three 8-pin PCIe power connectors, and with another 75W from the PCIe slot itself we’re looking at total power consumption somewhere in the neighborhood of 500W (roughly twice that of a single 7970GE). Asus uses their own Super Alloy Power delivery system here, with a total of 20 power phases between the various GPUs and memory components.
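As a rough sanity check on that figure, the connector layout caps how much power the card can draw under the PCIe specification (150W per 8-pin connector plus 75W from the slot):

# Maximum board power available from this connector layout under the PCIe spec.
EIGHT_PIN_W = 150  # per 8-pin PCIe power connector
SLOT_W = 75        # from the PCIe x16 slot

print(3 * EIGHT_PIN_W + SLOT_W)  # 525 W ceiling
print(2 * 250)                   # ~500 W: two GPUs at a 7970GE's 250W board power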

Meanwhile cooling the beast will be something even more outlandish and unique for a first-party video card: closed loop liquid cooling augmented with standard air cooling. Rather than going with an even bigger HSF like the ARES (I) or competitor PowerColor’s Devil13, Asus is sticking with a dual-slot card and moving most of the cooling duties to a 120mm radiator. The radiator itself is a single 120mm block with a pair of fans in a push-pull configuration, very similar to aftermarket CPU coolers we’ve reviewed in the past such as the Corsair H80i. These closed loop coolers have proven to be extremely effective in the past, so it will be interesting to see how Asus’s implementation plays out, particularly with the uniquely high power/heat properties of a dual-GPU video card. Meanwhile Asus also has an 80mm fan on the card itself (thereby making this a hybrid configuration), and based on their description we believe this covers cooling the memory and VRMs, while the closed loop cooler is dedicated to the GPUs.

Asus for their part points out that by going with a closed loop cooler they can keep the size of the card down, making it compatible with more cases. Certainly at 11.8” long and two slots wide the card is by no means small, but this configuration does mean that it should be possible to squeeze the card into some mATX enclosures that aren’t accessible to triple-wide cards. Alternatively it will be much easier to set up a pair of these cards in Crossfire for a 4-way GPU configuration, this being the bane of triple-wide cards on most ATX motherboards.

Finally, like past ARES and MARS cards, the ARES II is once again a limited edition product with Asus constructing just 1000 cards, each numbered and sold in a collectable fashion. Unfortunately Asus hasn’t announced specific pricing or a launch date, but based on their prior cards $1500 is not out of the realm of possibility.

  Asus ARES II PowerColor Devil13 HD7990 Radeon HD 7970 GE
Stream Processors 2 x 2048 2 x 2048 2048
Texture Units 2 x 128 2 x 128 128
ROPs 2 x 32 2 x 32 32
Core Clock 1050MHz 925MHz 1000MHz
Boost Clock 1100MHz N/A 1050MHz
Memory Clock 6.6GHz GDDR5 5.5GHz GDDR5 6GHz GDDR5
Memory Bus Width 2 x 384-bit 2 x 384-bit 384-bit
VRAM 2 x 3GB 2 x 3GB 3GB
TDP A Lot A Lot 250W
Transistor Count 2 x 4.31B 2 x 4.31B 4.31B
Manufacturing Process TSMC 28nm TSMC 28nm TSMC 28nm



ASUS' Qube with Google TV - Update: Hands On

Posted: 07 Jan 2013 08:00 AM PST

It looks like a lot of companies are going to be dipping their toes in the TV waters this year. ASUS just announced Qube, its first Google TV device. We're on our way to meet with ASUS but for now we do have a shot of the Boxee Box-like Qube by ASUS. The Qube apparently integrates voice and motion control, although it's not immediately apparent how the latter works (it seems to be via an included remote). The name Qube comes from the cube UI that users can control with the remote or an Android app.

We just spent some time with the device, which should launch in February. The Qube uses the new Marvell ARMADA 1500 platform that other Google TV devices use now. The Qube user interface is pretty straightforward, with each face of the Qube home to a different selection of apps/tasks. Transition animations weren't really smooth, at least on this demo system.

The remote features a full QWERTY keyboard on the flip side, sort of like the Boxee Box remote but with a much better feel. The front of the remote has a dedicated Netflix button as well as navigation buttons. The remote has a built-in mic for voice commands, and it also features an integrated gyro for motion sensing controls.

The chassis itself features two HDMI ports allowing for set-top box passthrough. There's an ethernet port, 2.4GHz WiFi and a USB port as well. The Qube will ship with an IR blaster, with a total price somewhere in the $100 - $120 range.



AT&T 2013 Developer Summit Keynote: Live Blog

Posted: 07 Jan 2013 07:35 AM PST

While CES 2013 is going full tilt, AT&T is also holding its developer event and keynote at the Palms, in almost the same place where we were last night for NVIDIA. We just took our seats for the AT&T 2013 Developer Summit keynote; stay here for the live blog.



SanDisk Ultra Plus SSD Review (256GB)

Posted: 07 Jan 2013 05:00 AM PST

SanDisk is a household name as far as NAND storage is concerned. You can’t walk into an electronics retailer and not be bombarded with SanDisk made CompactFlash and SD cards. Rip open a tablet and you might find a SanDisk solution. Even the new $249 Cortex A15 based Chromebook uses a SanDisk iNAND device on-board. Similar to Intel/Micron, SanDisk and Toshiba jointly develop and manufacture their own NAND.

The brand is well known and its products are prevalent throughout the OEM space, but in retail SanDisk isn’t as strong as some of its competitors. Recently SanDisk has been hoping to change this with the release of several of its own branded SSDs: the SanDisk Ultra (SF-1200) and SanDisk Extreme (SF-2281).

Like many other SSD companies once married to SandForce, SanDisk also looked elsewhere for controllers. And like many of those same companies, SanDisk ended up at the same place: Marvell.

Today SanDisk is announcing the Ultra Plus SSD, based on Marvell’s 88SS9175 6Gbps SATA/NAND controller. Read on for our full review!



Audience Announces 3rd Generation Sound Processor - eS515

Posted: 07 Jan 2013 04:00 AM PST

About a month ago, Audience flew me out to their office in California to talk about a number of things. First, they offered the chance to check out the anechoic rooms and listening booths used for testing and tuning smartphones equipped with Audience voice processors. Second, to compare notes about testing and evaluating voice quality on mobile devices, both compared to the testing I’ve done and to that done by wireless operators. Third, to take a look at their newest voice processor, the eS515. We’ve been covering noise rejection performance and voice quality on smartphones and tablets in our mobile device reviews, and Audience makes a line of standalone voice processors that work to improve voice quality at both the origin and endpoint of a mobile call.

First, Audience’s newest processor, the eS515, fundamentally changes from being just a standalone voice processor to a combination voice processor and audio codec. There’s a corresponding change in naming from voice processor to sound processor, as the eS515 takes the place of both a standalone codec and the slot otherwise taken by a standalone voice processor. This changes Audience’s lineup from a solution which requires an additional package (codec plus voice processor) to one that includes all the functions of a normal audio codec in addition to the Audience noise processing infrastructure. The move allows direct access to all of the audio rails in addition to likely being an easier sell to OEMs looking for a simple standalone solution. The eS515 includes a 1.13 watt class-D speaker driver and a 30mW class-G headphone driver.

The other big news about the eS515 is the inclusion of new and improved audio processing features, some of which move beyond just an emphasis on removing noise from calls and on ASR (automatic speech recognition, or voice to text).

For a while we’ve seen smartphones shipping with two microphones, and recently some smartphones with three microphones, including the iPhone 5 and a number of prior Motorola phones. Until now, however, these implementations have used these microphones in pairs, selecting combinations of two microphones to use at a time. Likewise, Audience designs with three microphones used the microphones in pairs, selecting primary and secondary microphones depending on the phone’s position. The eS515 is now the first to include a three-microphone algorithm for processing, should an OEM choose to include a third microphone.
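As a rough illustration of why extra microphones help, multi-microphone noise suppression ultimately relies on combining the channels so that sound arriving from the talker's direction adds coherently while off-axis noise does not. The snippet below is the textbook delay-and-sum approach, shown only to convey the principle; it is not a description of Audience's proprietary processing.

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
FS = 16000              # sample rate in Hz

def delay_and_sum(signals, mic_positions, look_direction):
    # Textbook delay-and-sum beamformer, illustrative only. look_direction is a
    # vector pointing from the microphone array toward the talker.
    look = np.asarray(look_direction, dtype=float)
    look /= np.linalg.norm(look)
    out = np.zeros_like(np.asarray(signals[0], dtype=float))
    for sig, pos in zip(signals, mic_positions):
        # A mic closer to the talker (larger pos . look) hears the wavefront earlier,
        # so delay its signal by that lead time to line all channels up.
        lead_samples = int(round(np.dot(np.asarray(pos, float), look) / SPEED_OF_SOUND * FS))
        out += np.roll(np.asarray(sig, dtype=float), lead_samples)  # wrap-around ignored for brevity
    return out / len(signals)

# e.g. three hypothetical mics 2 cm apart along the x-axis, "listening" toward +x:
# cleaned = delay_and_sum([m0, m1, m2], [(0, 0, 0), (0.02, 0, 0), (0.04, 0, 0)], (1, 0, 0))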

New features for the eS515 also include de-reverb for rooms with heavy reverb, improved wideband processing for VoIP calls (up to 24 kHz, well beyond the 8 kHz used for wideband cellular voice calls; 16 kHz is also supported), further improved ASR processing, and finally features for reducing noise when recording videos.

This last note is similar to what Motorola shipped on a number of prior phones that leveraged the three-microphone setup, which offered different scenes such as narration mode, wind reduction, and so on. The eS515 includes an interview mode called “Audio Zoom” which looks for a voice source behind and in front of the device and rejects noise elsewhere. Audience envisions a camera UI similar to Motorola’s, with different audio scenes for users to choose from when recording videos.

I recorded a short video of “Audio Zoom” being demonstrated on an eS515 simulator.

After getting a design win, Audience works with handset and tablet makers to do final tuning on their devices, both after final industrial design is finished and sometimes on acoustic design before finalization. Part of this requires using special calibrated rooms to characterize the frequency response and directionality of devices. In addition Audience needs testing methodology for benchmarking its own projects.

I got to peek into Audience’s anechoic chamber and an ETSI room as defined by EG 202 396–1 for noise suppression testing. Inside both rooms is a HATS (Head And Torso Simulator), which is instrumented with microphones for testing phones and tablets, along with a controllable testing apparatus for holding the device under test and moving it through various positions.

The ETSI 202 396–1 specification defines a setup with four speakers and a subwoofer playing distractor music around the caller and a simple room layout. I plan to move our own smartphone call testing to a similar setup as well.



Zotac ZBOX ID42 Keeps 'Ion' Alive

Posted: 07 Jan 2013 03:26 AM PST

Zotac's ZBOX mini-PCs have proved popular in emerging markets. The Ion boxes based on the Intel Atom and NVIDIA GPU combination, however, have run their course. NVIDIA no longer markets GPUs for the Ion platform. Zotac has put its own twist on this effective but low-cost platform by releasing the ZBOX ID42 / ID42 Plus with a Celeron processor and an NVIDIA GeForce GT 610 GPU.

Zotac's mini-PCs do have some noise issues, and we will have to wait and see if the redesigned chassis (with an extra 7mm of thickness) of the ZBOX ID42 does anything to alleviate the problem. The unit also has two GbE ports and two Wi-Fi antenna slots. The ventilation area is also tripled compared to the previous generation chassis. The specifications of the Zotac ZBOX ID42 / ID42 Plus are presented below.

Product Name ZBOX ID42
SKU ZBOX-ID42 ZBOX-ID42-PLUS
Memory 2 x 204-pin DDR3 SO-DIMM slots (Up to 16GB) DDR3-1333
Hard Disk 1x 2.5-inch HDD / SSD
PLUS Configuration 4GB DDR3 / 500GB 5400RPM HDD
CPU Intel Core Processor (Celeron) (dual-core, 1.1 GHz)
GPU NVIDIA GeForce GT 610
Video Memory 512MB DDR3
Display Options HDMI & DVI
Memory Card Reader 4-in-1 (SD/SDHC/MMC/SDXC)
Ethernet 2 x 10/100/1000Mbps
WiFi Onboard 802.11n Wi-Fi & Bluetooth 4.0
USB Ports 2x USB 2.0 (rear) (high-amperage charging supported), 2x USB 3.0 (1 on top, 1 on front)
Audio Onboard analog stereo high-definition audio, HDMI audio (bitstream), Digital Optical S/PDIF
DirectX® Support DirectX® 11 with Shader Model 5.0
Other Features Hardware Video Decode Acceleration, Blu-ray 3D support, Dual-display capable, HDCP: Yes
OS Support Windows 7 & 8 ready

Along with the ID42, Zotac is also updating their Core i3-based lineup with Ivy Bridge processors in the ID83 / ID83 Plus. This configuration still uses the previous generation chassis.

The specifications of the ID83 / ID83 Plus are provided below.

Product Name ZBOX ID83
SKU ZBOX-ID83 ZBOX-ID83-PLUS
Memory 2 x 204-pin DDR3 SO-DIMM slots (Up to 16GB) DDR3-1600
Hard Disk 1x 2.5-inch HDD / SSD
PLUS Configuration 4GB DDR3 / 500GB 5400RPM HDD
CPU Intel Core i3 3120M (dual-core, 2.5 GHz)
GPU Intel HD Graphics 4000
Video Memory Shared Memory Architecture
Display Options HDMI & DVI
Memory Card Reader 4-in-1 (SD/SDHC/MMC/SDXC)
Ethernet 10/100/1000Mbps
WiFi Onboard 802.11n Wi-Fi & Bluetooth 4.0
USB Ports 4x USB 2.0 (1 front, 1 top, 2 rear) (high-amperage charging supported), 2x USB 3.0 (rear)
Audio Onboard analog stereo high-definition audio, HDMI audio (bitstream), Digital Optical S/PDIF
DirectX® Support DirectX® 11 with Shader Model 5.0
Other Features Hardware Video Decode Acceleration, Blu-ray 3D support, Dual-display capable, HDCP: Yes
OS Support Windows 7 & 8 ready

We will be meeting up with Zotac later this week at CES. Stay tuned for further information including pricing details.



CES Gear - What's in Vivek's Bag

Posted: 07 Jan 2013 02:50 AM PST

After seeing Brian’s post on the gear he brought to CES, I decided to write my own. Forewarning: his kit is far more intense than mine. After two years of carrying a pretty heavy backpack all over the show floor, I was fed up. My focus was on outright mobility for what I took with me on the go, which meant two really key substitutions: a Windows 8 tablet for a real notebook, and a compact interchangeable lens camera for a DSLR. My first CES, I carried a 14” HP EliteBook and a Nikon D5000 (18-55 kit lens). Last year was my trusty 13” MacBook Pro and my equally trusty Nikon D7000 (18-105 kit lens). Either way, it’s a total of about 8 pounds, plus the weight of the backpack and any other miscellaneous equipment I was carrying on me (notably, an iPad the first year that I didn’t even bother with last year.)

This year, I’m going with the Samsung ATIV Smart PC as my mobile computing solution - it’s a Clover Trail W8 slate with oodles of battery life, a solid keyboard/laptop dock, a Wacom digitizer, and a total weight of 3 pounds. I have my fair share of issues with this system, but it’ll last all day, give me a solid writing tool, and run Lightroom so I can publish decent posts on the go. That’s really all I need it to do - I’m often too busy during the day between meetings to really require any heavy computational power in my travel companion. 

I’m also bringing my Samsung Series 7 Chronos. I have some fascination with this system - it’s a very good all-rounder type notebook, which goes a ways toward explaining why I’ve now bought two of them and will probably upgrade to the new one later this year. Decent amount of compute horsepower (IVB quad and a GT 640M), good keyboard, not too large, acceptable battery life, pleasing aesthetics, and good build quality. Two downers: the screen isn’t fantastic (1600x900 15.6” non-IPS), and the webcam/mic are beyond garbage. But it’s never failed me, even under some heavy torture while I used it as the primary workstation for my Master’s thesis and associated research. But I’m leaving it at the hotel this time around. My days of carrying 5 pound notebooks around the Las Vegas Convention Center are over. So over. 

Ideally, I’d carry something along the lines of a 13” MacBook Air or Zenbook Prime, which would allow me to eliminate the second system. But I went with what I had on hand, so we’ll see how well it works. If I was driving to the show, I would have brought my 27” iMac. That would have been really awesome. But it won’t fit in my carry on baggage, so it’ll have to stay at my desk. 

On the camera side, I picked up a Panasonic Lumix GF3 just before Christmas on an Amazon Lightning Deal on impulse, after a month of deliberating on whether or not to pick up a compact sidekick for my D7000. The combined weight of the D7000 body and the Tamron 17-50 f/2.8 is something on the order of 3.5 pounds. That just didn’t appeal to me from a lugging-around-the-show-floor standpoint, hence the jump into the world of Micro 4/3rds cameras. The GF3 came with a 14-42mm PZ f/3.5-5.6 pancake zoom lens out of the box, and I quickly made up my mind to replace the lens with something faster (I’ve learned my lesson as far as kit lenses go, trust me). Major shoutout to my friend Tao for loaning me his 14mm f/2.5 prime lens in exchange for a bowl of pho, but I set my sights on the higher end 20mm f/1.7 prime and picked one up on Craigslist earlier today. The speed will go perfectly with the utter lack of good lighting anywhere you go at CES. I’m going to save the 14mm as a backup wide angle. The GF3 body, plus both primes, totals about a pound. Toss in a wrist strap stolen from a long discarded point and shoot, and just like that, we’re in business.

I’m bringing an iPad mini as a quick way to browse the web and preview photos on the floor. I picked up a 30 pin to Lightning adapter to go with my old iPad Camera Connection Kit, resulting in a somewhat ridiculous looking chain of adapters, but it works. It’s light, so I’m willing to make the allowance - it’s far more convenient to pull out than any of the larger 10”+ tablets. 

From a connectivity standpoint, I will have my Nexus 4 on hand as my day to day device, on T-Mobile’s DC-HSPA+. I’m also carrying a T-Mobile Galaxy S3 with a second T-Mobile SIM in it as a dedicated hotspot device. I’m not on the ground in Vegas yet, but based on what Brian has told me, it sounds like T-Mobile is getting pretty hammered already, so I’m not that hopeful. I’m also bringing a FreedomPop WiFi hotspot, on the off chance that I’m the only person in attendance with a device that connects to Sprint’s old WiMAX network. It’s worth a shot? Maybe? We’ll see. 

Other random gadgets I’m bringing: a 6000 mAh battery for recharging devices on the go, given to me at CES last year by the friendly people at the MyCharge booth, a couple of flash drives, a mostly-empty 1TB Seagate USB3 hard drive just in case, Razer’s Orochi bluetooth mouse for the hotel, the Samsung earbuds from the Galaxy S3 box, and a Battlefield 3-edition Razer Blackshark headset for the flights. Also, a 6 port power strip from IKEA, and a bunch of cords, cables, and power bricks. 

With the noticeable slimming down of my gadget profile, I also decided to downsize the carrying case. I’ve got a fantastic High Sierra Elite series backpack (fits up to 17” notebooks) that I’m bringing everything in, including my showfloor bag. That bag is an ASUS-branded Targus notebook briefcase from my much loved ASUS W7S (this is from back in mid-2007, when ASUS used to make exclusively high end computers and included accessories like cases and bluetooth mice with each one). It’s small enough that it fits inside my High Sierra backpack, and comes with a shoulder strap that isn’t pictured here. It’s tiny, it looks relatively professional, and has enough space for what I want to bring to the show floor. My concerns are two-fold: one shoulder strap versus splitting my load across two in a backpack - will it get tiring? Also, if I decide to bail on the Clover Trail experiment mid-conference, I’ll also need to switch to the High Sierra backpack, because the Series 7 won’t fit into the briefcase.

Estimated total weight for my entire gear bag: just over 6 pounds, which is less than just my DSLR+notebook setups from years past. I am so excited. But also a little bit terrified. Because when your most powerful on-hand computing device is running Atom and your camera looks like a glorified point and shoot, you don’t really have that much room for error. Anand said in reaction to my gear, “You are a brave man.” The line between bravery and foolishness is a thin one. Time to start praying. 



Email In: ASRock Offering Two New Motherboards – 990FX Extreme9 and FM2A85X-ITX

Posted: 07 Jan 2013 02:18 AM PST

With the release of the Vishera processors (AMD Piledriver based FX-8350), very few motherboards were brought to market, as the chips slotted directly into the AM3+ socket and the 990FX + SB950 chipset. However, an email just dropped into my inbox offering up a new AM3+ motherboard for review from ASRock – the 990FX Extreme9.

With a potential mountain of knowledge to exploit since the original 990FX release, ASRock have the opportunity to apply the upgraded BIOS, software, driver stack and utilities they have gained since. The product features for this motherboard include a total of ten SATA ports (8 SATA 6 Gbps, 2 eSATA 6 Gbps), eight USB 3.0 ports (4 on the rear IO, 2 via onboard headers), an Intel NIC, a Realtek ALC898 audio codec, support for 3-way CrossFireX and SLI, as well as Windows 8 fast booting options.

Also in my inbox dropped an info pack regarding an ITX motherboard for the Trinity processor platform, the FM2A85X-ITX, which looks very interesting indeed:

Even though the AMD bracket is a large space hog on any board, ASRock have decided to get around this problem on mITX by placing the memory slots above the socket and left-to-right.  Technically this is because the socket is also adjusted that way, meaning that the PCIe lanes that normally come off the bottom of the processor have to route around towards the PCIe slot.  The location of the 4-pin CPU power may be of concern, but the board offers seven SATA 6 Gbps ports on the motherboard itself, suggesting it could be handy for a home NAS type arrangement.

Both models get drizzled in ASRock features such as XFast 555 and the new X-Boost. Neither board currently shows up on Newegg for pricing, but they should soon. Let me know if you would like to see either of these on our review test beds.

 


