Wednesday, October 31, 2007

IBM aims to salvage silicon for solar industry

It's like a Waterpik for silicon wafers. IBM has come up with a way to more efficiently recycle semiconductor manufacturers' scrap silicon wafers for the solar industry. The technique involves polishing a scrap wafer with water, an abrasive pad, and a piece of machinery ordinarily used to smooth out the pits and valleys in production chips. Rather than smoothing the wafer's surface, though, the primary goal of the water-polishing process is to erase any intellectual property or chip designs on the wafer. "We use it literally to scrap off the integrated circuits," said Tom Jagielski, an engineering manager at IBM working on the project.


Water makes the process more eco-friendly; usually, wafers are dipped in abrasive chemicals or blasted with tiny glass beads to remove circuitry. IBM estimates that around 3 million wafers get scrapped a year. If you could turn all of those into solar panels, they would be capable of generating 13.5 megawatts of power. That would represent a small percentage of the world's solar output: Sharp, the largest solar panel maker in the world, can manufacture more than 600 megawatts of solar cells a year. But a global silicon shortage, which started after large subsidies in Germany goosed solar demand in early 2004, makes any source of processed silicon welcome.

Texas Instruments, Intel, and other companies already sell their scrap wafers. TI used to unload them, garage-sale style, outside the factory. Starting in 2004, it began to sell them in a more uniform program. Now, TI sells about 1 million scrap wafers a year to solar manufacturers, resulting in about $8 million in revenue. Scrap wafers, more often than not, are blanks: for instance, wafers that were used to calibrate or test manufacturing processes. The wafers get repolished after each test and, after a certain point, become too thin to use as test blanks. Though thin, they remain thick enough for the solar industry.

Jagielski said that IBM plans to share the technique with other chipmakers, but how and when it will be shared has yet to be determined. IBM currently recycles wafers this way in its Burlington, Vt., facility and will bring the process to its East Fishkill, N.Y., plant.

The silicon shortage is expected to continue at least through next year. Although supply has increased, global demand for solar panels is expanding. Other solutions for ameliorating the problem lie in relying on dirty, or less pure, silicon for solar panels. CaliSolar has come up with a way to isolate impurities in dirty silicon so that panels made from the material will still generate electricity. Ordinarily, impurities cause silicon panels to fail.
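A quick back-of-the-envelope check of the figures above (a sketch using only the numbers reported here; the per-wafer wattage is derived, not stated):

```python
# Figures from the article: ~3 million scrapped wafers a year could
# yield panels generating 13.5 megawatts in total.
wafers_scrapped_per_year = 3_000_000
total_potential_mw = 13.5

# Implied output per recycled wafer (derived, not stated in the article).
watts_per_wafer = total_potential_mw * 1e6 / wafers_scrapped_per_year
print(watts_per_wafer)  # 4.5 W per wafer

# For scale: Sharp alone can make more than 600 MW of cells a year,
# so the recycled supply really is a small slice of solar output.
print(total_potential_mw / 600)  # 0.0225, i.e. about 2 percent
```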

Handheld Supercomputer

One nanotechnology researcher says supercomputers small enough to fit into the palm of your hand are only 10 or 15 years away. "If things continue to go the way they have been in the past few decades, then it's 10 years," said Michael Zaiser, a professor and researcher at the University of Edinburgh School of Engineering and Electronics. "The human brain is very good at working on microprocessor problems, so I think we are close -- 10 years, maybe 15."

Zaiser's research into nanowires should help move that timeline along. For the last five years, he has been studying how tiny wires -- 1,000 times thinner than a human hair -- behave when manipulated. Each such minuscule wire tends to behave differently when put under the same amount of pressure, which has made it impossible to line them up close together in tiny microprocessors in a production setting. Zaiser says he has now figured out how to make the wires behave uniformly: he separates the interior material of the wire into distinct groups so the wire can't react as a whole, making it much easier to control. "It's like crowd control," he added. "If they can all go one way, you have a big mess."

These nanowires will go inside microprocessors that could, in turn, go inside PCs, laptops, mobile phones or even supercomputers. And the smaller the wires, the smaller the chip can be. Shrinking the microprocessors is a big step toward shrinking computers. Zaiser was quick to point out that his nanowire discovery won't immediately lead to supercomputers that fit in the palm of a hand or shrink to the size of a book of matches. In addition to smaller microprocessors, engineers will need to deal with the thermal fluctuations that erupt at that scale. But he does admit that taming nanowires is a big step toward the goal of small supercomputers. "This will enable chips to become much smaller," he said. "Think 10 years back. You could hold [a mobile phone] the size of a telephone receiver and it didn't work so well. Today, you can fit what is a powerful computer onto a small device."

Charles King, an analyst at Pund-IT, said that with advances like nanowire technology continuing to come along, Zaiser's predictions for tiny supercomputers may not be so far off. And that will be a huge step for the industry, considering that not so long ago supercomputers filled enormous rooms or even entire buildings. "Actually, what he's saying is not crazy," said King. "The wire problem was an important one. Solving that particular issue, one so fundamental, needed to be done. If he can solve that, a lot of people, a lot of companies, will appreciate that. The industry is aligning in a direction where we're going to be seeing continuing improvements and developments."

Jim McGregor, an analyst at In-Stat, said he can definitely see palm-size supercomputers arriving within 10 to 15 years, but he thinks they'll be based on the basic silicon technology used today. The next generation of tiny supercomputers -- made with carbon nanotubes and nanowires -- may be as far off as another 30 to 40 years, he added. "You know, everybody says we'll get to a physical limit, but every time we think we're going to hit it, we overcome it," said McGregor. "They end up being speed bumps instead of road blocks." King added that over the last 10 or 20 years, technology advancements have put smaller and more powerful computers into the hands of children and researchers alike, which leads him to believe that the shrinking of supercomputers is just ahead of us. "We're at the beginning of some very exciting times in supercomputing," he said.

Xbox 360 IPTV still coming

The Xbox 360 fall update is just around the corner, and speculation is rife about what features it will introduce. Among the more talked-about but still unconfirmed features is a tool that would let parents limit how long their children can play. Another rumor claimed the fall update would finally give the 360 the IPTV functionality Microsoft promised back in January at the Consumer Electronics Show.

On Tuesday, Microsoft confirmed one feature that would not be included in the fall update. After the site Xbox 360 Fanboy discovered some IPTV functionality on a recently serviced 360, Microsoft said the functionality would be excluded from the console's looming firmware update. "This was an isolated incident where these features were inadvertently exposed while the customer's console was being serviced and is unrelated to the fall update," the company said in a statement e-mailed to CNET News.com sister site GameSpot.

Microsoft did not comment on the specific functionality unveiled in the leak, which included onscreen chat while viewing live TV, recorded television, and on-demand movies. However, the company did confirm that the 360's IPTV functionality will be a version of its Microsoft Mediaroom software, which was unveiled in June.

Microsoft's statement also raised the possibility that IPTV won't make it into 360 owners' living rooms this year. At CES, Microsoft Chairman Bill Gates said his company was planning a holiday 2007 launch for 360 IPTV. Now, Microsoft says the functionality will be in the hands of IPTV service providers by year's end--and not necessarily those of consumers. "Internet Protocol Television (IPTV) features will only be available from the Xbox 360 through a service provider who has deployed Xbox 360 with Microsoft Mediaroom (IPTV) services," read the Microsoft statement. "Xbox 360 with Microsoft Mediaroom (IPTV) will be available to service providers by the end of the year. Microsoft's IPTV service providers will ultimately determine the timing of Xbox 360 with Microsoft Mediaroom deployments."

Apple: 2 million copies of Leopard sold

Apple said it has sold more than 2 million copies of the latest version of its operating system, Mac OS X Leopard, since its release on Friday. Leopard introduces new features to Apple computers, including automatic backup, a quick way to browse and share files across multiple Macs, and a new way to preview files without opening an application, the company said in a statement. Apple has been very successful with its products these days. Keep it up, Apple!

40 GB PlayStation 3 with 65-nm Cell and RSX

Under the heading "A thriftier and quieter PlayStation 3," the current issue of c't (23/2007), now on sale with the cover title "Mac instead of Vista?", reveals further details of the new PlayStation 3 with a 40 GB hard drive. The new model, commercially available since mid-October for 399 euros, changes more under the hood than Sony's official announcement would suggest.

It was already known that the new PlayStation 3 model has only a 40 GB hard drive, no longer offers PlayStation 2 backward compatibility, and does without the card reader and two of the formerly four USB ports. New, however, are a significantly reduced power consumption and lower noise compared with the previous models.

Based on a power draw that has fallen to just 135 watts under full load (previously around 200 watts), the c't staff conclude that the new model is equipped with the 65-nanometer versions of Nvidia's RSX graphics chip and the Cell processor that IBM announced back in March 2007. In Remote Play mode, power consumption has dropped from the previous 160 to 180 watts down to 120 to 125 watts.

Alongside a new revision of the motherboard (SEM-001), the new model also uses a new revision of the Southbridge (CXD2984GB). At the same time, Sony has reworked the heat pipe, which now looks less massive. The Cell processor's power supply has been reduced from three phases to two, which also points to the 65-nm variant. The new PlayStation 3 is quieter as well: instead of 1.3 sone, the c't staff now measured only 0.5 to 0.8 sone. Sony no longer sources the 256 MB of main memory from Samsung, relying on Elpida instead. In addition, Sony now integrates a button-cell battery on the motherboard to keep the system time.
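Taking c't's measurements at face value, the savings work out to roughly a third under full load (a quick arithmetic sketch; the Remote Play figure uses the midpoints of the reported ranges):

```python
# Full-load power draw reported by c't for the old and new models.
old_full_load_w = 200.0   # previous models, approximate
new_full_load_w = 135.0   # new 40 GB model
full_load_saving = 1 - new_full_load_w / old_full_load_w
print(f"{full_load_saving:.1%} less power under full load")  # 32.5%

# Remote Play mode, using midpoints of the reported ranges
# (160-180 W before, 120-125 W now).
old_remote_w = (160 + 180) / 2   # 170 W
new_remote_w = (120 + 125) / 2   # 122.5 W
remote_saving = 1 - new_remote_w / old_remote_w
print(f"{remote_saving:.1%} less power in Remote Play")  # 27.9%
```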

Tuesday, October 30, 2007

Storm Worm botnet computers for sale

SecureWorks researcher Joe Stewart has seen evidence that the massive Storm Worm botnet is being broken up into smaller networks, a surefire sign that the CPU power is up for sale to spammers and denial-of-service attackers.

Stewart, a reverse-engineering guru who has been tracking Storm Worm closely, says the latest variants of Storm are now using a 40-byte key to encrypt their Overnet/eDonkey peer-to-peer traffic. "This means that each node will only be able to communicate with nodes that use the same key. This effectively allows the Storm author to segment the Storm botnet into smaller networks. This could be a precursor to selling Storm to other spammers, as an end-to-end spam botnet system, complete with fast-flux DNS and hosting capabilities," Stewart said in an e-mail message. "If that's the case, we might see a lot more of Storm in the future," he warned.

The malware attacks behind this botnet have been relentless all year, using a wide range of clever social-engineering lures to trick Windows users into downloading executable files with rootkit components. By some accounts, the malware has successfully created a massive botnet -- between one million and 10 million CPUs -- producing computing power to rival the world's top 10 supercomputers. Statistics from Microsoft's monthly updated MSRT (Malicious Software Removal Tool) peg the size of the botnet at the low end of that speculation.

Stewart sees a silver lining in the latest Storm Worm twist. Because of the new encryption scheme, he says, it is now easier to distinguish Storm-related traffic from "legitimate" Overnet/eDonkey P2P traffic. "[It] makes it easier for network administrators to detect Storm nodes on networks where firewall policies normally allow P2P traffic," he said.
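The segmentation idea is simple to illustrate. The sketch below is not Storm's actual protocol -- its unknown cipher is stood in for by repeating-key XOR, and all names are invented -- it just shows how handing sub-groups of nodes different 40-byte keys partitions one peer-to-peer network into mutually unintelligible smaller ones:

```python
import os
from collections import defaultdict

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for Storm's cipher: repeating-key XOR.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key_a = os.urandom(40)  # 40-byte keys, as in the Storm variants
key_b = os.urandom(40)
nodes = {"n1": key_a, "n2": key_a, "n3": key_b}

# Nodes sharing a key form one sub-network: same-key traffic is readable,
# other-key traffic is opaque noise.
segments = defaultdict(list)
for node, key in nodes.items():
    segments[key].append(node)

msg = xor_crypt(b"peer announce", key_a)
assert xor_crypt(msg, key_a) == b"peer announce"   # n2 can decode n1
assert xor_crypt(msg, key_b) != b"peer announce"   # n3 cannot
print([sorted(v) for v in segments.values()])  # [['n1', 'n2'], ['n3']]
```

A side effect noted in the article: keyed traffic that decrypts under a known Storm key is, by the same token, easy to fingerprint against ordinary Overnet/eDonkey traffic.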

Apple makes a handsome profit on the iPhone

We already knew that the iPhone was making Apple money by the barrelload, but we didn't know exactly how much. Well, uh, you'd better sit down. According to the Financial Accounting Standards Board, Apple receives a whopping $18 per month from AT&T for each iPhone customer. That adds up to $432 for each two-year contract, or, including the price of the phone, $831 per iPhone. Yes, Apple is making more than twice the retail price of the iPhone for every one sold. With cost analyses putting the price of iPhone components at around $250, that means Apple is pulling in around $580 in profit on each iPhone. Yeah, I know, they need to pay their designers and their advertisers and their rent and all the business overhead that comes with running a gigantic electronics company, but the fact of the matter is that Apple is pulling in over the full price of the iPhone in profit with each one sold. Now we know why they went with that five-year exclusive deal with AT&T, eh? It's basically a five-year contract with a money-printing machine.
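The arithmetic behind those numbers, spelled out (a sketch using the article's figures; the $399 handset price is implied by the $831 total minus the $432 in carrier payments):

```python
# AT&T reportedly pays Apple $18 per iPhone customer per month.
att_monthly_payment = 18
contract_months = 24                       # two-year contract
carrier_revenue = att_monthly_payment * contract_months
print(carrier_revenue)                     # 432

# Add the handset price to get total revenue per iPhone sold.
handset_price = 399
revenue_per_iphone = carrier_revenue + handset_price
print(revenue_per_iphone)                  # 831

# Subtract the estimated component cost for the gross margin.
component_cost = 250
gross_margin = revenue_per_iphone - component_cost
print(gross_margin)                        # 581, i.e. "around $580"
```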

Using GPU and CPU to recover passwords

ElcomSoft has discovered and filed for a US patent on a breakthrough technology that decreases the time it takes to perform password recovery by a factor of up to 25. ElcomSoft has harnessed the combined power of a PC's central processing unit and its video card's graphics processing unit. The resulting hardware/software powerhouse will allow cryptology professionals to build affordable PCs that work like supercomputers when recovering lost passwords.

Using the "brute force" technique, it was already possible, though time-consuming, to recover passwords from popular applications. For example, the logon password for Windows Vista might be an eight-character string composed of uppercase and lowercase alphabetic characters. There would be about 53 trillion (52 to the eighth power) possible passwords. Windows Vista uses NTLM hashing by default, so on a modern dual-core PC you could test up to 10,000,000 passwords per second and perform a complete analysis in about two months. With ElcomSoft's new technology, the process would take only three to five days, depending on the CPU and GPU.

Until recently, graphics cards' GPUs couldn't be used for applications such as password recovery. Older graphics chips could only perform floating-point calculations, and most cryptography algorithms require fixed-point mathematics. Today's chips can process fixed-point calculations, and with as much as 1.5 GB of onboard video memory and up to 128 processing units, these powerful GPU chips are much more effective than CPUs at performing many of these calculations. In February 2007, NVIDIA, the worldwide leader in programmable graphics processor technologies, launched CUDA, a C compiler and developer's kit that gives software developers access to the parallel processing power of the GPU through the standard C language. NVIDIA GPUs (GeForce 8 and above) act as multiprocessors with multiple registers, shared memory, and cache.

ElcomSoft has harnessed their computing power and will be incorporating this patent-pending technology into its entire family of enterprise password recovery applications. Since high-end PC motherboards can host four separate video cards, the future is bright for even faster password recovery applications. Preliminary tests using the ElcomSoft Distributed Password Recovery product to recover Windows NTLM logon passwords show that recovery speed increased by a factor of twenty simply by hooking up a $150 video card's onboard GPU. ElcomSoft expects similar results as this new technology is incorporated into its password recovery products for Microsoft Office, PGP, and dozens of other popular applications.
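The brute-force estimate is easy to verify (a sketch of the arithmetic only; real throughput depends on the hardware and the hash implementation):

```python
# Keyspace for an 8-character password over upper- and lowercase letters.
alphabet = 52            # a-z plus A-Z
length = 8
keyspace = alphabet ** length
print(f"{keyspace:,} candidates")   # 53,459,728,531,456

# At the quoted 10 million NTLM guesses per second on a dual-core CPU:
guesses_per_second = 10_000_000
seconds = keyspace / guesses_per_second
days = seconds / 86_400
print(f"~{days:.0f} days to exhaust")   # ~62 days, roughly two months

# With the claimed ~20x GPU speed-up:
print(f"~{days / 20:.1f} days with a 20x speed-up")   # ~3.1 days
```

The "about two months" and "three to five days" figures in the text fall straight out of this division.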

The 10 Biggest Web Annoyances

PC World has published an article on the top 10 Web annoyances. Topping the list: dubious privacy policies. Read more at PC World.

Verizon Reports Continued Success in 3Q 2007

Verizon Communications today reported another strong quarter of financial and operational results. Verizon Wireless continued its record of industry-leading profitability, Verizon Telecom reported accelerating sales of FiOS TV, and Verizon Business increased overall sales and sales of strategic services.

Verizon reported third-quarter 2007 earnings of 44 cents in fully diluted earnings per share (EPS). This compares with third-quarter 2006 earnings of 53 cents per share before income from discontinued operations that have since been sold or divested. On an adjusted basis (non-GAAP), third-quarter 2007 earnings were 63 cents per share. This is a 14.5 percent increase, compared with 55 cents per share in the third quarter 2006 after excluding discontinued operations. Adjusted earnings in the third quarter 2007 reflect 19 cents per share in special items: 16 cents per share for international taxes, 2 cents per share for costs related to a previously announced spin-off of access lines and 1 cent per share in merger integration costs. Adjusted earnings in the third quarter 2006 excluded 2 cents per share in special items for pension settlement charges, and merger integration and Verizon Center relocation costs.

Successful Transformation

"Our third-quarter results show that we have hit our stride as a leading wireless, broadband and enterprise company," said Verizon Chairman and CEO Ivan Seidenberg. "In recent years, we have transformed our business model and revenue base. Our results throughout 2007, and especially in the third quarter, show that our strategy has been successful. We expect to build on these results in the fourth quarter and beyond."

Strong Consolidated Results

Verizon's total operating revenues grew 5.8 percent to $23.8 billion, compared with the third quarter 2006. Operating revenues grew 6.0 percent on an adjusted basis (non-GAAP). Verizon's total operating expenses increased 3.4 percent to $19.6 billion, compared with the third quarter 2006. Operating expenses increased 3.5 percent on an adjusted basis (non-GAAP). On a reported basis, Verizon's operating income grew 19.0 percent to $4.2 billion, compared with the third quarter 2006. On an adjusted basis, operating income grew 18.5 percent to $4.3 billion. Operating income margin rose to 17.7 percent, compared with 15.7 percent in the third quarter 2006. On an adjusted basis, Verizon's operating income margin rose to 18.1 percent, compared with 16.2 percent in the third quarter 2006. Cash flows from continuing operations totaled $18.0 billion through the first nine months of 2007. This represents 5.0 percent growth over the same period last year.

Dividends, Repurchase Program Reflect Confidence

Reflecting confidence in Verizon's business model and continued strong cash flows, Verizon's Board of Directors announced during the third quarter that it had increased the company's quarterly dividend 6.2 percent, beginning with the Nov. 1 dividend. Verizon also repurchased nearly $800 million of its shares in the quarter, for a total of $1.7 billion in the last nine months. Verizon is increasing the 2007 target for its share repurchase program to $2.5 billion, up $500 million from the original target.

Wireless Continues to Lead Industry

Verizon Wireless extended its record of strong, industry-leading performance. It continues to be the largest domestic wireless company in total revenues, data revenues and retail customers. In the third quarter:

* Nearly all of the 1.8 million retail net customer additions (including acquisitions and adjustments) were post-paid customers.
* Total customers (retail and wholesale) increased to 63.7 million. The company added 1.6 million total net customers after approximately 115,000 net reductions to the company's wholesale base.
* Verizon Wireless continued its industry-leading customer loyalty, with 1.21 percent retail churn. Churn among retail post-paid customers, at 0.96 percent, was substantially lower.
* Revenues totaled $11.3 billion, up 14.4 percent. Service revenues were $9.7 billion, up 15.1 percent, driven by customer growth and demand for data services.
* ARPU levels (average monthly revenue per customer) were the company's highest ever: $52.17 retail service ARPU, up 1.9 percent year over year; $10.59 retail data ARPU, up 42.9 percent.
* Wireless operating income margin was 27.1 percent, the company's second highest. EBITDA margin on service revenues (non-GAAP) was 44.7 percent.

Wireline Reports Strong Growth in FiOS, Strategic Services

Verizon's Wireline business, which includes Verizon Telecom and Verizon Business, reported continued strong growth in customers of FiOS fiber-optic services and sales of strategic services to enterprise customers. In the third quarter:

* Verizon added a net of 202,000 new FiOS TV customers. The company has 717,000 FiOS TV customers in total, with approximately 600,000 added within the past 12 months. Including satellite TV customers served in partnership with DIRECTV (a net of 85,000 added this quarter), Verizon has more than 1.5 million video customers.
* Verizon added a net of 285,000 new broadband connections (DSL and FiOS Internet connections combined). Broadband connections totaled 8.0 million, an increase of 21.3 percent compared with the third quarter 2006. The company added a net of 229,000 FiOS Internet connections this quarter, for a total of 1.3 million.
* ARPU in legacy Verizon wireline markets (which exclude former MCI consumer markets) increased 10.8 percent to $58.79, compared with last year's third quarter. This increase was due to strong demand for broadband and TV services.
* Wireline operating income margin rose to 9.4 percent, compared with 8.8 percent in last year's third quarter.
* Verizon Business had revenues of $5.3 billion, or growth of 2.2 percent compared with last year's third quarter on an adjusted basis (non-GAAP). This is Verizon Business' fourth consecutive quarter of year-over-year, pro-forma revenue growth (non-GAAP, calculated as if Verizon and MCI had merged on Jan. 1, 2005).
* Strong sales of key strategic services -- such as IP (Internet protocol), managed services, Ethernet and optical ring services -- continued to drive Verizon Business' growth. These services generated $1.4 billion in revenue, up 28.6 percent from last year's third quarter.
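The EPS reconciliation in the release reduces to simple arithmetic (all values in cents per diluted share, as reported):

```python
# Reported third-quarter 2007 EPS, plus the three listed special items.
reported_eps = 44
special_items = {
    "international taxes": 16,
    "access-line spin-off costs": 2,
    "merger integration costs": 1,
}
adjusted_eps = reported_eps + sum(special_items.values())
print(adjusted_eps)  # 63, matching the stated non-GAAP figure

# Year-over-year growth against adjusted Q3 2006 EPS of 55 cents.
prior_year_adjusted = 55
growth = (adjusted_eps - prior_year_adjusted) / prior_year_adjusted
print(f"{growth:.1%}")  # 14.5%, as stated
```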

Monday, October 29, 2007

Some Leopard upgraders see 'blue screen of death'

A significant number of Mac owners upgrading to Leopard on Friday reported that after installing the new operating system, their machines locked up, showing only an interminable -- and very Windows-like -- "blue screen of death."

Easily the heaviest-trafficked thread on the Leopard support forums as of late Friday, "Installation appears stuck on a plain blue screen" told how, after a successful Leopard install using the default "Upgrade" option and the required restart, some users' Macs refused to budge from the blue screen. Although many gave up after 30 to 60 minutes and rebooted, others were more patient and let their Macs sit for as long as six hours. "Hmmmm. I feel like a windoze user now," said Doug Mcilvain. "I have re-installed and it has been sitting there with a blue screen for 4 1/2 hours."

Almost everyone who added to the thread -- which included more than 200 messages and over 7,400 views by 10:30 p.m. Friday, Pacific time -- had selected the Upgrade option. Set as the default, Upgrade is the least intrusive of the three install options. "Most of your existing settings and applications are left untouched during an upgrade," Apple states in an online support document. In fact, some reports speculated that the glitch might be related to a third-party program that installs a base-level framework that modified OS X. Frustrated users who rebooted to the install DVD and then upgraded a second time using the "Archive and Install" option reported success, with no lingering blue screen after restart. "I grew impatient after the first hour and rebooted to DVD and then reinstalled choosing the Archive/Install option," said volksapple. "That worked just fine. Despite this small hiccup, it's far better than any Windows upgrade I've suffered through."

Other users, however, waited it out, or were told to by Apple support personnel. One user, James Mitchell9, said the blue screen finally vanished at the 75-minute mark. Others claimed they had been told the long blue-screen-of-pause could last as long as two, or even three, hours.

Still others jumped in with instructions to manually uninstall APE (Application Enhancer), a framework created by Unsanity LLC for use with its Mac-customizing haxies such as ShapeShifter. "Please note that this does involve manipulation of files from the root prompt," cautioned Chris Mcculloh, who first made the suggestion. "This is not for the faint of heart, or those who are unfamiliar with the UNIX file system/command structure." Mcculloh listed the steps as:

* Reboot into single-user mode (hold Command-S while booting).
* Remove the following files by typing each line below (the paths contain spaces, so keep the quotes):

rm -rf "/Library/Preference Panes/Application Enhancer.prefpane"
rm -rf "/Library/Frameworks/Application Enhancer.framework"
rm -rf "/System/Library/SystemConfiguration/Application Enhancer.bundle"
rm -rf /Library/Preferences/com.unsanity.ape.plist

* To exit and continue booting normally, type: exit

For those who have not yet installed Leopard, the Unsanity APE app provides an uninstaller that can be used first to remove the framework before the new OS is installed. Alternatively, users can choose the "Archive and Install" option, which places a new copy of Leopard on the user's computer while moving older OS files to another folder. That has the effect of moving the potentially offending APE software to a location where it can do little to interfere with the installation process.

Apple was not available for comment Friday night, but an Australian user claimed support said the phones had been ringing "non-stop" over the problem since 9 a.m. local time. Australia was one of the first countries where Mac users got their hands on Leopard. A few took the install screw-up in stride, or at least kept their sense of humor. "I don't remember seeing the option in setup under Installation Type that said 'Wait indefinitely while you stare at a blue screen and eventually go mad'," said Phill Horrocks1.

The future of gaming

Sure, it sucks to get killed while playing a shoot-'em-up, but you know you can just get up and start over again. But what if you actually felt all that carnage? Would it make you think twice before charging in? Don the 3rdSpace gaming vest, and you'll be feeling gunshots, missile attacks, kicks, punches, and other types of body impact. Designed by a surgeon, the vest was originally created for use in the medical field to poke and prod patients in order to get a sense for what they were feeling. Since then, the vest has been adapted for the game industry, capable of delivering hits and shots exactly where you would feel them. Utilizing air pouches -- four on front, four in back -- the vest nudges and jabs gamers at eight different contact points. The vest, uh, hits in November for $189, and will ship with Call of Duty. Another vest is also in the works, this one aimed at flight and driving sims. What do you think? Would you buy this vest just to play games? I'll most likely buy one to make my gaming more fun.

Terabyte Thumb Drives coming soon

Digg! Slashdot It! Researchers have developed a low-cost, low-power computer memory that could put terabyte-sized thumb drives in consumers' pockets within a few years. Thanks to a new technique for manipulating charged copper particles at the molecular scale, researchers at Arizona State University say their memory is, bit-for-bit, one-tenth the cost of -- and 1,000 times as energy-efficient as -- flash memory, the predominant memory technology in iPhones and other mobile devices. "A thumb drive using our memory could store a terabyte of information," says Michael Kozicki, director of ASU's Center for Applied Nanoionics, which developed the technology. "All the current limitations in portable electronic storage could go away. You could record video of every event in your life and store it." The new memory technology -- programmable metallization cell (PMC) -- comes as current storage technologies are starting to reach their physical limits. At the tiny scale envisioned for new devices, flash memory becomes unstable. The physical limits of flash are already being approached, and could be reached in the near future, which could slow product development for portable device makers like Apple and Sandisk. PMC memory stores information in a fundamentally different way from flash. Instead of storing bits as an electronic charge, the technology creates nanowires from copper atoms the size of a virus to record binary ones and zeros. In research published in October's IEEE Transactions on Electron Devices, Kozicki and his collaborators from the Jülich Research Center in Germany describe how the PMC builds an on-demand copper bridge between two electrodes. When the technology writes a binary 1, it creates a nanowire bridge between two electrodes. When no wire is present, that state is stored as a 0. The key enabling technology for the memory is nano-ionics, a field that focuses on moving and transforming positively charged atoms. 
In PMC memory, the charged atoms, or ions, are harnessed by applying a negative charge, which transforms them into copper atoms that line up to form nanowires. Kozicki says the process is like condensing a crystal from a solution, except that it is almost infinitely reversible: if the PMC is fed a positive charge, the copper atoms return to their free-floating state and the nanowires disassemble. Kozicki says the technology can be built from materials commonly used in the memory industry, which should help keep manufacturing costs down.

The memory industry has already taken an interest. Three companies--Micron Technology, Qimonda and Adesto (a stealth-mode startup)--have licensed the technology from Arizona State's business spin-off, Axon Technologies. Kozicki says the first product containing the memory, a simple chip, is slated to come out in 18 months.

Market-research firm iSuppli projects the flash-memory market growing from $20 billion in 2006 to $32 billion in 2011. Mark DeVoss, a senior analyst in flash memory at iSuppli, says a lot of companies are gunning for a share of that $12 billion in growth, but it's hard to handicap the likely winners. "There's a lot of elegant technologies," DeVoss says. "But you have to be able to scale it down and deliver a low cost-per-bit."

Kozicki's licensees believe the technology will deliver the outsize improvements that could drive the memory mainstream. "No other technology can deliver the orders-of-magnitude improvement in power, performance and cost that this memory can," says Narbeh Derhacobian, CEO of Adesto, who previously worked at AMD's flash-memory division. Adesto has received $6 million from Arch Venture Partners and additional funding from Harris & Harris, a venture firm specializing in nanotechnology.
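The write/erase mechanism described above--grow a copper bridge for a 1, dissolve it for a 0--amounts to a tiny state machine. A toy sketch of that logic (an illustration of the concept only, not a device model):

```python
class PMCCell:
    """Toy model of a programmable metallization cell: a bit is stored
    as the presence (1) or absence (0) of a copper nanowire bridge."""

    def __init__(self):
        self.bridge = False  # no nanowire yet: the cell reads as 0

    def apply_bias(self, voltage):
        if voltage < 0:
            # A negative bias reduces copper ions into atoms that line
            # up into a conductive bridge -> write a 1.
            self.bridge = True
        elif voltage > 0:
            # A positive bias re-ionizes the copper; the bridge
            # disassembles -> erase back to 0.
            self.bridge = False
        # Zero bias: the state simply persists -- the memory is nonvolatile.

    def read(self):
        # Reading checks conductivity between the two electrodes.
        return 1 if self.bridge else 0

cell = PMCCell()
cell.apply_bias(-0.5)   # write
print(cell.read())      # -> 1
cell.apply_bias(0.0)    # power removed: state retained
print(cell.read())      # -> 1
cell.apply_bias(+0.5)   # erase
print(cell.read())      # -> 0
```

The nonvolatility in the zero-bias branch is what distinguishes this scheme from charge-based storage: nothing has to be refreshed or held, because the bit is a physical wire.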

New kind of jet fuel

Princeton Professor of Mechanical and Aerospace Engineering Fred Dryer has a lofty goal: end the nation's reliance on oil for jet travel. With potentially major benefits for energy security and the environment riding on his success, Dryer is advancing the fundamental knowledge of jet fuels while developing practical, innovative energy sources.

"In order to make alternative jet fuel sources feasible, they need to be compatible with petroleum and produce similar combustion performance," Dryer said. "This will only be possible if we fully understand how both petroleum and alternative fuels burn and design engines based on this fundamental knowledge."

Backed by government and industry grants, Dryer is leading two new research efforts to advance these technologies. The first, a major project funded by the U.S. Air Force, is focused on developing computational and kinetic models that accurately simulate the burning of jet fuel, a complex and poorly characterized mix of chemicals. At the same time, he is putting his basic understanding to use as he develops jet fuels with near-zero net greenhouse gas emissions in a project funded by NetJets, a leading provider of business jets.

The Air Force program is one of the Defense Department's highly competitive Multidisciplinary University Research Initiative (MURI) grants. One of only ten such projects supported by the Air Force this year, the collaboration involves researchers from four institutions: Princeton, Case Western Reserve University, Pennsylvania State University and the University of Illinois-Chicago. The award, with an overall value of up to $7.5 million, will provide support for three years with the option of a two-year extension. Research began in July, and the kick-off meeting for the project will be held Sept. 17 in Princeton.
Dryer and his MURI collaborators, including Princeton Associate Professor of Mechanical and Aerospace Engineering Yiguang Ju, will develop methods to predict and evaluate how jet fuels will behave in actual engines and to characterize the emissions they will produce. While current guidelines specify some overall properties of jet fuels, they do not spell out the actual chemical composition. Depending on the source and processing method, jet fuel typically consists of hundreds to thousands of molecular structures that behave in a variety of ways. The models developed by the team will represent and characterize the behavior of this broad range of jet fuel species using only a few types of molecular structures as surrogates for the larger whole. Dryer previously developed similar "surrogate fuel" models to represent gasoline, which are now being used for engine design by the automotive industry.

"The composition of fuels changes with the geographic source, the refining process and even with the season," Dryer noted. "Since we have an energy security problem, we need to be sure that alternative fuel sources are going to work and, in order to do that, we need to understand exactly how petroleum-based fuels work alone and in combination with alternative fuels."

Alternative energy sources, if designed appropriately, could also significantly reduce the amount of greenhouse gases released in creating and burning jet fuel. According to the U.S. Department of Transportation, aviation is responsible for around 10 percent of the nation's greenhouse gas emissions from transportation, or roughly 2.7 percent of the country's total greenhouse gas emissions. The second research program, supported by NetJets, augments Dryer's fundamental MURI work and brings in additional expertise from the Princeton Environmental Institute to develop "greener" alternative fuels.
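The surrogate-fuel idea above--stand in for thousands of species with a handful of well-understood molecules blended to match the real fuel's bulk properties--can be sketched numerically. In this illustration (the surrogate palette and the target numbers are assumptions for demonstration, not the MURI team's actual model), a brute-force search picks mole fractions that match a target hydrogen-to-carbon ratio and average molecular weight:

```python
# Three common surrogate components (real formulae); the targets are
# made-up stand-ins for a measured jet-fuel sample, not published data.
SURROGATES = {
    "n-decane":   {"C": 10, "H": 22, "MW": 142.28},
    "iso-octane": {"C": 8,  "H": 18, "MW": 114.23},
    "toluene":    {"C": 7,  "H": 8,  "MW": 92.14},
}
TARGET_HC = 1.90   # hydrogen-to-carbon atom ratio of the real fuel
TARGET_MW = 125.0  # average molecular weight, g/mol

def blend_properties(fracs):
    """Bulk H/C ratio and mean molecular weight of a mole-fraction blend."""
    c = sum(f * SURROGATES[n]["C"] for n, f in fracs.items())
    h = sum(f * SURROGATES[n]["H"] for n, f in fracs.items())
    mw = sum(f * SURROGATES[n]["MW"] for n, f in fracs.items())
    return h / c, mw

def fit_blend(step=0.01):
    """Exhaustively search mole fractions (summing to 1) that minimize
    the relative error against the two targets."""
    best, best_err = None, float("inf")
    n = int(round(1 / step))
    for i in range(n + 1):
        for j in range(n + 1 - i):
            fracs = {"n-decane": i * step,
                     "iso-octane": j * step,
                     "toluene": 1 - (i + j) * step}
            hc, mw = blend_properties(fracs)
            err = ((hc - TARGET_HC) / TARGET_HC) ** 2 \
                + ((mw - TARGET_MW) / TARGET_MW) ** 2
            if err < best_err:
                best, best_err = fracs, err
    return best

blend = fit_blend()
print({k: round(v, 2) for k, v in blend.items()})
print(tuple(round(x, 2) for x in blend_properties(blend)))
```

A real surrogate model also has to reproduce ignition delay, sooting tendency and distillation behavior, which is why building one is a multi-year research program rather than a grid search; this only shows the matching principle.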
"NetJets is pleased to be working with the engineers and scientists at Princeton to develop new jet fuels with near-zero net greenhouse gas emissions," said NetJets Chairman and CEO Richard Santulli. "Princeton has a longstanding history of leadership in aerospace science. We feel they will make great strides with this research."

The NetJets-funded project provides the opportunity to make substantial progress toward launching green technologies not only for corporate jets, but also for commercial aviation and transportation in general, according to Robert Williams, a senior research scientist at the Princeton Environmental Institute and a member of the NetJets-sponsored research team. At Princeton, the team also includes Ju and Eric Larson, a research engineer at the Princeton Environmental Institute. In addition, the work will involve collaboration with researchers at the Institute of Transportation Studies at the University of California-Davis.

Two alternative fuel sources that are the subject of much investigation in the aviation field -- coal and biomass -- present a major quandary to researchers attempting to develop low-emissions jet fuels. Coal, a relatively cheap and readily available source of energy, has an emissions profile at least as harmful as petroleum's. Biofuel -- fuel made from plants -- presents an attractive alternative because the carbon dioxide emitted from burning biomaterials is removed from the atmosphere by the next generation of plants during photosynthesis. The production of biomaterials, however, requires intensive use of land, limiting the feasibility of widespread biofuel production.

To take advantage of the positive characteristics of each of these sources, the Princeton researchers will center their efforts on synthesizing jet fuels from a combination of coal and biomass. A key component of their solution is isolating and storing the carbon dioxide produced during the production of these so-called synfuels.
This technique, called carbon capture and sequestration, is a promising strategy being investigated intensively by Princeton's Carbon Mitigation Initiative, among other research programs. An "especially attractive feature" of processing coal and biomass together to make synfuels is that it requires only half the amount of biomaterial as pure biofuel production, while still making fuels with near-zero greenhouse gas emissions, Williams said.

The ultimate success of the research efforts will depend on how well the synfuels compare with traditional fuel sources in terms of fuel characteristics, costs, and environmental and safety issues, the researchers said. "There is no doubt that developing feasible alternatives to petroleum for the aviation industry will be a long and expensive process," Dryer said. "And success, in the form of an enduring solution, will be priceless."

Via Princeton
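One back-of-the-envelope way to see how co-feeding coal with capture can halve the biomass requirement is a simple carbon ledger: the atmosphere's net gain is the fuel carbon burned in flight, minus the carbon the biomass drew down, minus nothing else if process CO2 is stored rather than vented. The numbers below (a 50% carbon-conversion efficiency for pure biofuel, equal coal and biomass feeds for the co-fed case) are illustrative assumptions for this sketch, not Williams' published figures:

```python
def net_emissions(fuel_c, biomass_c, coal_c, captured_c):
    """Net atmospheric carbon per unit of fuel produced: carbon burned in
    flight plus carbon vented at the plant, minus the carbon the biomass
    pulled from the air. Captured process CO2 is stored, not vented."""
    vented_at_plant = biomass_c + coal_c - fuel_c - captured_c
    return fuel_c + vented_at_plant - biomass_c

F = 1.0  # one unit of carbon ends up in the finished jet fuel

# Pure biofuel, no capture: with an assumed 50% carbon efficiency, two
# units of biomass carbon are needed per unit of fuel carbon; the other
# unit is vented during conversion, but photosynthesis already paid for it.
bio_only = net_emissions(fuel_c=F, biomass_c=2 * F, coal_c=0.0, captured_c=0.0)

# Coal + biomass with capture: process CO2 goes underground, so the
# biomass only has to offset the carbon burned in flight -- half as much.
co_feed = net_emissions(fuel_c=F, biomass_c=F, coal_c=F, captured_c=F)

print(bio_only, co_feed)  # -> 0.0 0.0

# For contrast, coal alone without capture is strongly positive:
print(net_emissions(fuel_c=F, biomass_c=0.0, coal_c=2 * F, captured_c=0.0))
```

Both routes come out near zero in this toy accounting, but the co-fed route gets there with half the biomass input, which is the land-use advantage the article describes.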

Hacker betrays the piracy industry

Promises of Hollywood fame and fortune persuaded a young hacker to betray former associates in the BitTorrent scene to Tinseltown's anti-piracy lobby, according to the hacker.

In an exclusive interview, gun-for-hire hacker Robert Anderson tells for the first time how the Motion Picture Association of America promised him money and power if he provided confidential information on TorrentSpy, a popular BitTorrent search site. According to Anderson, the MPAA told him: "We would need somebody like you. We would give you a nice paying job, a house, a car, anything you needed.... if you save Hollywood for us you can become rich and powerful."

In 2005, the MPAA paid Anderson $15,000 for inside information about TorrentSpy -- information at the heart of a copyright-infringement lawsuit brought by the MPAA against TorrentSpy of Los Angeles. The material is also the subject of a wiretapping countersuit against the MPAA brought by TorrentSpy's founder, Justin Bunnell, who alleges the information was obtained illegally.

The MPAA does not dispute that it paid Anderson for the sensitive information, but insists it had no idea that Anderson stole the data. "The MPAA obtains information from third parties only if it believes the evidence has been collected legally," says MPAA spokeswoman Elizabeth Kaltman. The MPAA's use of Anderson is one of a series of controversies the movie industry is confronting in its zero-tolerance war on piracy.
MediaDefender, a California company that tracks and disrupts file sharing of movies and music, was reported to Swedish authorities last month by The Pirate Bay, after an internet leak revealed the extent to which MediaDefender pollutes file-sharing services with fake, decoy content. And an executive at a national theater chain successfully pressed New Jersey authorities in August to prosecute a teenager for filming 20 seconds of a movie at a theater to show to her little brother later.

Anderson's account shows that the content industry may be willing to go to significant -- and some say ethically questionable -- lengths in its war against online piracy, and that it is determined to keep its methods secret. "It was an understanding," Anderson says of the deal, "that it was hush-hush."

Anderson's brief Hollywood career began in the spring of 2005, after an online advertising venture with TorrentSpy founder Bunnell turned sour. Looking to profit in other ways, Anderson approached the MPAA with an e-mail offering to help the movie studios' lobbying arm beat piracy, which the industry says costs it billions in lost sales each year. Among other things, Anderson proposed to implement an anti-piracy marketing campaign for the MPAA. But he says he also offered to provide inside information on TorrentSpy, which, along with The Pirate Bay, is among the most popular BitTorrent destinations for downloaders looking for free movies and music. "It was an opportunity to make money, because I knew how these networks operated," he says.

On June 8, 2005, within weeks of sending his unsolicited e-mail, Anderson says he was put in touch with the MPAA's Dean Garfield, then the organization's legal director. Anderson says he told Garfield that he had "an informant that can intercept any e-mail communication." Anderson didn't tell Garfield that he was the "informant," or that he'd already hacked into TorrentSpy's systems.
The hacker, then 23 and living in Vancouver, British Columbia, claims he had cracked TorrentSpy's servers by simply guessing an administrative password. He knew the password was weak -- a combination of a name and some numbers. "I just kept changing the numbers until it fit," he says. "I guess you can call it luck. It took a little more than 30 tries."

Once inside, he programmed TorrentSpy's mail system to relay e-mail to a newly created external account he could access. There's a trace of pride in his voice as he details the hack. "The e-mails weren't forwarded using the mail command. They were sent actually before it reached anyone's mailbox," he says. "So it was more like interception before delivery. I could even stop certain mail from reaching their box." In this manner, Anderson says, he sucked down about three dozen pages of e-mails detailing banking, advertising and other confidential information. "Everything they were talking about was sent to my Gmail," he says. "Everything they sent, anything sent to them, I got: invoices; in one case they sent passwords."

Among the purloined files was the source code for TorrentSpy's backend software, says Anderson. He alleges this interested the MPAA, which he says wanted to set up a fake BitTorrent site of its own. According to Anderson, the MPAA said, "We'll set up a fake Torrent site. We'll contact the other Torrent sites. We'll get their names, address books, contact information and banking information.... (They) wanted to run this as a shadow portion of the MPAA." MPAA spokeswoman Kaltman says the MPAA had no such plans, and calls the accusation that it wanted to set up a phony Torrent site "patently false."

Via Wired
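The "name plus some numbers" password Anderson describes falls in roughly 30 guesses because its effective search space is tiny: once the name is known, only the numeric suffix carries any entropy. A quick back-of-the-envelope (the digit counts are illustrative, not the actual password):

```python
import math

# Candidate count and entropy (in bits) for a password built from a
# known name plus an unknown N-digit numeric suffix.
search_space = {digits: 10 ** digits for digits in (1, 2, 3, 4)}

for digits, candidates in search_space.items():
    print(f"{digits}-digit suffix: {candidates} candidates, "
          f"{math.log2(candidates):.1f} bits of entropy")
```

Even a four-digit suffix yields barely 13 bits, versus the 50-plus bits generally recommended for anything guarding an administrative account; on average an attacker finds an N-digit suffix after trying half the candidates.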

Sunday, October 28, 2007

Nokia: Content key in boosting cell phone use

Nokia, the world's largest cell phone maker, sees content for the wireless Internet as key to snapping up more mobile phone users, including in emerging markets, the head of its key unit said on Tuesday. Kai Oistamo, who heads Nokia's mobile phone division, said the biggest barrier to people in emerging markets using the mobile Internet was lack of interest.

"We are looking as a company as to how to facilitate and participate in creating content that is actually relevant for the consumers in emerging markets," Oistamo said in a speech at a telecommunications forum at Helsinki University of Technology. The target market will define what will globally be the next big thing in mobile phones, Oistamo said, adding that location-based services combined with social networks would be one of them.

Nokia's recent acquisitions have supported a content strategy as the company moves to transform itself from a pure hardware company into an online services company. In July Nokia said it would acquire social networking and photo-sharing site Twango, as millions of users flock to similar sites like MySpace and Facebook. Nokia also recently launched its Internet services portal, Ovi. And earlier this month Nokia launched an $8.1 billion offer for the U.S. digital-map supplier Navteq, which would be its largest acquisition ever. Via Cnet