The same technology used to track pigs headed for your breakfast table may soon help their wild cousins in the Amazon.
As part of a five-year study of biodiversity in the Amazon, the World Wildlife Fund recently began using radio-frequency identification, or RFID, transponders to track white-lipped peccaries -- tusked pig-like animals weighing 100 pounds or more.
Every few days, peccaries descend on natural salt and mineral deposits called collpas to chow down on clay that aids in digestion and supplements their mineral-deficient diet. For the study, WWF researchers are tagging peccaries from different herds with RFID transponders. Four RFID readers in strategic points around the site passively register data on the tagged animals' visits for WWF staff to download later.
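It's easy to imagine how those downloads become usable data. Here is a minimal Python sketch of the general idea, collapsing raw (tag, reader, timestamp) read events into per-animal visit records; the field names, sample data and 30-minute gap threshold are illustrative assumptions, not details from the WWF study.

from datetime import datetime, timedelta

# Hypothetical raw reads: (tag_id, reader_id, timestamp).
# Real reader hardware and log formats vary; this is illustrative only.
reads = [
    ("peccary-017", "reader-1", datetime(2007, 6, 14, 5, 2)),
    ("peccary-017", "reader-1", datetime(2007, 6, 14, 5, 9)),
    ("peccary-017", "reader-3", datetime(2007, 6, 17, 6, 40)),
    ("peccary-102", "reader-2", datetime(2007, 6, 17, 6, 41)),
]

GAP = timedelta(minutes=30)  # assumed: reads closer together than this are one visit

def visits(reads):
    """Collapse raw tag reads into one record per collpa visit."""
    by_tag = {}
    for tag, reader, ts in sorted(reads, key=lambda r: r[2]):
        records = by_tag.setdefault(tag, [])
        if records and ts - records[-1]["end"] <= GAP:
            records[-1]["end"] = ts  # same visit, extend it
        else:
            records.append({"reader": reader, "start": ts, "end": ts})
    return by_tag

for tag, vs in visits(reads).items():
    print(tag, [(v["reader"], v["start"].isoformat()) for v in vs])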
Although white-lipped peccaries are not endangered, they travel far and in large numbers, with as many as 400 animals in a single herd. Because of this, says George Powell, senior conservation scientist at the WWF, it's important to understand their habitat needs now, before rain forests in southeastern Peru and neighboring Bolivia and Brazil are cleared for logging, cattle or soy farming.
Now common in many supply chains, RFID tags help track apparel, shipping containers and cattle, and have been used in field research to trail wild salmon and even individual trees.
In the past, wildlife researchers were limited to costlier, labor-intensive means of tracking wildlife, such as very-high-frequency or GPS collars. But cost and convenience are making RFID tags an attractive alternative for some types of field studies. "VHF costs at least $300 per animal, and GPS systems cost about $3,000 each," says Powell. "Ear tags are a few cents apiece, so you can put out hundreds of them."
Also, with VHF and GPS tracking technologies, researchers have to drive or fly over a huge area -- or depend on a satellite -- to pick up a signal from tagged animals. And the signals can be obscured by the rain forest's canopy. Because peccaries take a predictable path to their pit stop, researchers can install readers on that path and then sit back and let the transmitters come to them.
The RFID tags' small size also allows the WWF to study young animals as they grow.
"You can' t put a collar around the neck of an animal that's going to grow. You'll choke it to death," says Powell. "Now, you can put RFID ear tags on little peccaries and see how long it takes them to grow up and if they survive. "
The World Wildlife Fund plans to use the information gathered in the peccary study, in conjunction with non-RFID research on jaguars, pumas and several parrots, to determine how large a protected area these wide-ranging species need to survive in the Amazon. The animal-protection group hopes such objective data will help it build international consensus on land management.
"A lot of times people say they are going to establish a park there, but do they know what they need in order to make it functional? " says Powell. "Look at Yellowstone National Park, one of our country's biggest. It was created without (this kind of) information, and it's not big enough to maintain grizzly bears -- an animal that is very important to the Yellowstone ecosystem. "
source : wired.com
The nightmare scenario that United Airlines and its passengers endured last week when its computer system shut down for two excruciating hours will happen again and again in other industries, analysts say, unless CIOs make significant changes in the way they manage their staffs and their systems.
The Chicago-based airline's June 20 debacle illustrates just how vulnerable a company can be to an innocent mistake—in this case, a United spokeswoman says an employee accidentally disabled the computers while running a routine test on Unimatic, United's flight operations system—and how far-reaching the consequences can be.
More than 260 United flights were delayed for an average of 90 minutes, and nearly 70 other flights were cancelled altogether. The timing couldn't have been worse. The system crashed between 8 a.m. and 10 a.m. Chicago time, right in the heart of the morning rush, leaving arriving United flights to scramble for the few available gates that weren't blocked by delayed United planes that were unable to take off.
"As soon as it happened, the I.T. employee who made the mistake knew what it meant to the system," says Robin Urbanski, a United spokeswoman. "It was just simple human error during a routine test of our flight operation system. It's unfortunate, but we're developing new processes that will prevent future issues like these from impacting our customers."
Making matters worse, according to Urbanski, was the fact that this testing snafu also knocked out Unimatic's backup system. She declined to comment on what new processes would be implemented, or if new software and hardware would be installed.
The Unimatic system, which was originally launched almost 20 years ago, provides pilots with flight plans, gives updates on maintenance information and crew schedules, and records the amount of weight and proper balancing of that weight for all outbound flights. Urbanski says the system has been updated repeatedly in recent years.
All of these applications, including the flight plans, crew scheduling and weight measurements, can still be done by hand in a pinch—but at a cost.
Michael Boyd, an airline consultant based in Evergreen, Colo., told the Chicago Tribune that United's computer glitch will cost it more than $10 million in lost revenue due to refunds and re-bookings, to say nothing of the hit its reputation took in the process. Worse, while United passengers and planes were stranded at the gates, its competitors were up and running as usual. This wasn't a weather-related incident; it was self-inflicted.
"Of course, organizations could do more to prevent such debacles," says Tom Welsh, senior consultant at Cutter Consortium, an Arlington, Mass.-based I.T. consulting firm. "The live and backup machines should be more rigorously separated, and there should be someone present to make sure these [innocent mistakes] don't happen. Companies strive to maximize profits, but often fallible decision-makers judge the risks wrong."
Analysts say the financial services, retail and transportation industries are particularly vulnerable to the type of systemic paralysis experienced by United last week because they must conform to rigid and unrelenting schedules.
While each company or organization has unique I.T. issues and limitations, analysts say CEOs and CIOs can take concrete steps to avoid the type of disruption United and its customers experienced.
Despite inconveniencing thousands of passengers, taking a likely eight-figure hit to its bottom line and disrupting service for more than 24 hours, United was fortunate that the computer outage only lasted two hours. Had the problem lingered for another two or three hours, analysts say, the cascading effect would have sent United's operations into a tailspin for as much as a week.
"United now knows that it made an incorrect cost-benefit judgment," says the Cutter Consortium's Welsh. "Now that they're aware, they will probably do the right thing and urgently appoint qualified people to analyze the scenario to make sure it doesn't happen again. These precautions are analogous to seat belts or crash helmets—they save lives but often are left off anyway."
source : baselinemag.com
Finding someone who doesn’t believe the iPhone hype is not an easy task these days.
But Arnie Berman, the chief technology analyst at investment bank Cowen, has an interesting take on the iPhone and all the talk about how it will revolutionize the mobile Internet. He thinks investors should sell Apple (AAPL), whose stock has surged 35 percent in the past 2 months, and buy Google (GOOG) instead.
In a report to clients Friday morning, Berman argues that Google stands to benefit as much as, if not more than, Apple from the mobile Web becoming a reality.
“At this stage we would rather commit fresh capital to stocks that have not already been bid up in paroxysms of excitement over the ‘mobile Internet’,” Berman wrote. “Apple shares have already benefited from a powerful hype cycle. In the months to come, Google is the much stronger candidate to benefit from a hype cycle whose DNA is similar to the one that has propelled Apple’s share price to dizzying heights.”
Berman’s contention is that if more and more people use their cell phones as devices to access the Web, Google winds up a winner because it should be able to capitalize on increased mobile search revenue. And the beauty for Google is that it doesn’t have to make a bet on hardware or carriers. As long as people use their handset, be it an iPhone, the latest device from Palm (PALM), Motorola (MOT) or Nokia (NOK) or even a BlackBerry from Research in Motion (RIMM), to do searches, Google should win.
“Google’s ability to capitalize on the emergence of a pervasive high bandwidth mobile Internet is much more assured than Apple’s ability to do so,” Berman wrote, adding later in his report that “the hardware business is inherently more unpredictable than the atomistic revenue streams associated with search. In our view, Apple remains a hits company. In fact, Apple is just one product cycle miss away from being treated with all the love and tenderness institutional investors reserve for Motorola.”
Berman also points out that Google and Apple now trade at the same price-to-earnings ratio, about 33 times earnings estimates for the next four quarters. That, he says, just defies logic considering that Google is a more profitable company that is growing faster than Apple.
“Should Apple and Google trade at the same valuations? We think the correct answer to this question is ‘absolutely not.’ Growth, risk and return are the holy trinity of corporate financial performance. Google trounces Apple in 2 categories – with vastly superior growth and a less risky business model. In the third category (return – measured here using both ROA and ROE), Google again trumps Apple – but by a smaller degree,” he wrote.
And Berman makes yet another interesting point about how iPhone hype has had an absurd impact on Apple’s stock. He argues that most of Apple’s appreciation during the past two months is due to the iPhone. That’s probably valid. And the run-up in Apple’s stock price has added $24 billion to the company’s enterprise value, which is equal to a company’s market value minus its cash but including debt and preferred stock.
So if you assume that the market believes iPhone is worth $24 billion in value to Apple’s stock, then this, Berman points out, means that the iPhone is worth two-thirds as much as all of Motorola and a third as much as Nokia. Is that reasonable? Even if Apple blows past its public goal of selling 10 million iPhones by the end of 2008, which Berman thinks it could, it still wouldn’t be enough to justify the stock’s run.
“Even under the most optimistic scenario, iPhone volumes will be paltry relative to MOT and NOK. These companies have operator relationships all over the planet - and will ship a combined total of roughly 650 million handsets in 2008,” Berman wrote, adding that Apple would need to sell over 600 million iPhones in the next 5 years in order for its iPhone business to deserve a valuation similar to Nokia’s. No matter how successful the iPhone is, 600 million sold in five years is extremely unlikely.
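Berman’s arithmetic is easy to check. Here is a short Python sketch using the figures quoted above; note that the Motorola and Nokia enterprise values are back-calculated from the two-thirds and one-third ratios rather than taken from his report, so they are implied estimates, not reported numbers.

# Figures quoted in the article (US$ billions, except unit counts).
iphone_ev = 24.0  # run-up in Apple's enterprise value attributed to the iPhone

# Enterprise value = market value - cash + debt + preferred stock.
# Back-calculating from Berman's ratios: iPhone ~ 2/3 of MOT, ~ 1/3 of NOK.
implied_mot_ev = iphone_ev / (2 / 3)   # ~ $36B
implied_nok_ev = iphone_ev / (1 / 3)   # ~ $72B

apple_goal_2008 = 10e6    # Apple's stated goal: 10M iPhones by end of 2008
mot_nok_2008 = 650e6      # combined MOT + NOK handset shipments forecast for 2008

print(f"Implied Motorola EV: ${implied_mot_ev:.0f}B")
print(f"Implied Nokia EV:    ${implied_nok_ev:.0f}B")
print(f"iPhone goal as share of MOT+NOK 2008 volume: "
      f"{apple_goal_2008 / mot_nok_2008:.1%}")   # ~1.5%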
Don’t get me wrong. This is not to say that the iPhone won’t be cool. I’m no Apple hater. If it weren’t for the fact that I live in New York and AT&T’s (T) coverage is pretty abysmal here, I’d consider buying one. But the notion that Apple will revolutionize the mobile Web and dominate it the way it does with the music player market rings a little false. And as Berman wisely points out, there are plenty of other ways to play this trend that give investors a better bang for their buck.
source : cnn.money.com
Despite having almost two years to prepare for a predictable onslaught of new passport applications this summer, the U.S. State Department and its information processing systems are now so backlogged that Congress is cracking the whip on behalf of irate travelers who are waiting and waiting and waiting for their passports.

The debacle began in April 2005, when the Homeland Security and State departments instituted new travel regulations requiring Americans flying to nearby countries such as Canada, Mexico and those in the Caribbean to carry a passport. It provides CIOs with a chilling reminder of just how wrong things can go if an organization isn't able to ramp up both its systems and its staff ahead of anticipated, record-breaking demand.

What used to take between four and six weeks is now taking three or four or sometimes even five months, leaving travelers angry, stressed out and uncertain whether they'll receive their passports in time for their vacations. In a letter last week, no fewer than 56 senators lambasted the State Department's woeful performance and called on Secretary of State Condoleezza Rice to intercede now, before any more travelers are inconvenienced.
"People are angry and frustrated," says Rick Webster, a lobbyist for the Travel Industry Association of America. "Companies have to respond to shareholders. The government has to respond to constituents. [The State Department] anticipated the demand but they guessed far too low. It shouldn't have come to this." State Department analysts say the agency originally predicted about 15 million passport applications this year, up from 12 million in 2005. Then the figure was revised to 16 million. Now, they're expecting upwards of 17.5 million or more. Despite hiring an additional 130 passport workers and expanding the number of locations that accept applications from 7,500 to 9,500 this year, the frustrating delays continue and will likely continue though the fall. Could something like this happen in the private sector ahead of a new product release or perhaps a new compliance or regulation deadline? "A good CIO would have the lead members of his or her team interview key internal clients or constituents so the IT department has the clearest possible understanding of the business situation," says Henry Harteveldt, an analyst at Forrester Research Inc. "Technology is the enabler, and can do almost anything, provided the people who are responsible for the managing that technology have the clear insight from those who run the business side of the enterprise." For the State Department and the passport processing sites, the situation was further complicated when it turned to the private sector for help. Citicorp, which processes the passport fees for the agency, hired an additional 400 workers to an existing staff of 800 to help clear the backlog. However, a State Department spokeswoman told the Los Angeles Times that once the new hires were trained and began processing the fees, all that paperwork was sent over in enormous batches, further swamping the overwhelmed passport processors. Citicorp officials were not available for comment. "On the corporate side, there are efficiencies and incentives built into the process to avoid something like this from happening," Webster says. "In government there's only accountability to the citizens." Also, unlike a Sarbanes—Oxley deadline or a new product release, the government had the luxury of relaxing the new travel requirements it had created for itself. That's not how it works in the private sector. One company that's about to experience a similar deluge in demand, AT&T, announced Thursday that it had hired an additional 2,000 temporary workers to meet what's expected to be frenzied demand for Apple Inc.'s iPhone on June 29. As the only carrier to provide service for the iPhone when it debuts, AT&T's sales staff received a total of 100,000 hours of training to sell and support the device, according to spokesman Mark Siegel. However, Siegel told Baseline the company would not discuss how much of the training and additional resources were allocated for updating or improving order processing and customers service systems. And while the State Department desperately struggles to catch up with the millions of applications it's processing this year, travel analysts say the worst may be yet to come. In 2009, Americans traveling by land and sea—in addition to air travel—to nearby countries will also have to carry a valid passport, adding another 26 million applications to the pile. "The State Department got caught because the normal turnaround time is six weeks," Harteveldt says. 
"They should have known that human nature being what it is; people generally wait until the last minute to do things." source : baselinemag.com |
By Glenn Fleishman
Senza Fili Consulting’s latest WiMax report predicts a relatively large market for both mobile and fixed uses of “mobile” WiMax: 54m subscribers could be signed up by 2012, with emerging markets and the U.S. key to WiMax’s worldwide uptake. Even so, 54m is a relatively low number compared to broadband (300m broadband subscriptions are in use today, according to a report earlier this week) and quite low compared to the billions of cell phone voice users expected by 2012.
Report author Monica Paolini notes that truly mobile WiMax access devices—not nomadic ones that are bulky or require AC power—will lead to increased adoption. That’s the goal of Sprint and Clearwire, certainly; they want PC Card form factor WiMax cards next year. Paolini also suggests that portable data devices with WiMax built in will be key to adoption in developing nations.
source : wimaxnetnews.com
By Glenn Fleishman
OnAir’s in-flight cellular GSM satellite-backed system received approval from the European Aviation Safety Agency (EASA): However, EASA approves airworthiness—the idea that a certified item won’t cause interference with the avionics or mechanical systems of a plane. That’s just one of many remaining hurdles before OnAir’s service is activated on an Air France A318, possibly as early as September.
In an interview a few weeks ago, OnAir chief commercial officer Graham Lake explained that in addition to certification of the GSM picocell system, the satellite connection to Inmarsat also required certification. Then the spectrum regulator of each country over which planes equipped with such gear might fly must also provide approval—that’s 34 countries in Europe alone. The firm had 12 of the 34 needed approvals as of last month; it expects to be approved in almost all European nations by the end of 2007.
OnAir has been working for years to provide in-flight mobile data and mobile calling using Inmarsat’s fourth-generation satellite system. Inmarsat’s satellite launches were severely delayed, the first of them by a year, and the delivery, rollout, and certification of airliner equipment has lagged as well. OnAir uses an onboard picocell system to allow GSM-based phones and handhelds to use GSM and GPRS for voice and data. Each airline will choose what combination of services it needs. Wireless carriers will set the ultimate price for voice calls, expected to be about US$2.50 per minute.
The Air France launch will start with just SMS (text messaging) and GPRS-based data. SMS messages will cost about 50 cents (U.S.) each, while GPRS pricing is still being sorted out. It’s possible, due to routes and timing, that Ryanair could have the first picocell-operating plane in the air: Air France is using a single Airbus plane equipped for its trial, while Ryanair is having its entire fleet of Boeings retrofitted.
Update: The International Herald Tribune gets two elements of the story wrong. First, it leads with the notion that EASA’s approval opens the door to mobile phones on planes. Per my notes above, it does not. There’s still spectrum regulation to be nailed down, and certification for the satellite kit, which OnAir said had not yet happened when I spoke to the company a few weeks ago.
Second, while Wi-Fi may be a future option for OnAir, it’s not in the near future. It would be ruinously expensive to offer Wi-Fi-based Internet access over Inmarsat’s system with the current pricing, and OnAir’s Lake told me the firm had not sought Wi-Fi certification in their current system design.
source : wifinetnews.com
By Glenn Fleishman
They promised June—they delivered June—but I was expecting, well, certified devices, not certification testing: The Wi-Fi Alliance has been asserting since last year that they would have a certification program in place during second quarter 2007 to test Draft N (early 802.11n) devices against Draft 2.0, an expected milestone in IEEE work on the standard. Today’s press release shows they met the mark, but I had naively assumed that there would be certified devices on the market in June, too. Alas, not so. The certification program has begun, and new firmware and equipment will be out this summer. How soon is unclear.
The Draft 2.0-compliant firmware that manufacturers have promised, and that chipmakers apparently completed months ago, will likely not be released for most Draft N Wi-Fi devices until certification is finished for each device, in case things need to be fixed.
Draft 2.0 should improve interoperability among devices, and it adds three separate protection mechanisms for the 2.4 GHz band to prevent N from using more spectrum than a comparable G device in the presence of other networks. In 5 GHz, where there’s much more “room,” only two of those mechanisms are needed, because interference is much less likely and easier to solve. (See “How Draft N Makes Nice with Neighbors,” an article I wrote after interviewing chipmakers that appeared on 2007-02-16.)
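As a toy illustration of that fall-back idea only (the actual Draft 2.0 protection mechanisms are considerably more involved than this), the neighbor-friendly behavior might be sketched in Python like so:

def choose_channel_width(band_ghz, neighbors_detected):
    """Toy model of the 'good neighbor' fall-back idea; NOT the real
    Draft 2.0 protection mechanisms, which are more involved."""
    if band_ghz == 2.4:
        # In crowded 2.4 GHz, drop to a single 20 MHz channel when other
        # networks are present, so N uses no more spectrum than a G device.
        return 20 if neighbors_detected else 40
    if band_ghz == 5:
        # More room in 5 GHz: interference is less likely and easier to avoid.
        return 40
    raise ValueError("unsupported band")

print(choose_channel_width(2.4, neighbors_detected=True))   # -> 20
print(choose_channel_width(5.0, neighbors_detected=True))   # -> 40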
Update: The Wi-Fi Alliance told me that certification could take as little as “hours,” and that 20 products were booked for testing on the first day of certification. That’s fine, but I’m interested in the cycle from certification to firmware release. I had really expected that certification results along with updated firmware would occur within the quarter, but I am just too darned optimistic.
Tim Higgins, meanwhile, is not very happy with his testing of pre-certification Draft 2.0 updates from D-Link. They don’t conform to his reasonable interpretation of the co-existence mechanisms for N and earlier B/G devices in the same frequency ranges.
source : wifinetnews.com
What do you do when your main competitor is about to unleash the most hyped gadget ever to the cell phone-toting masses? Give said gadget a mocking nickname in a memo to employees.
The Thursday letter to Verizon Wireless employees from Chief Operating Officer Jack Plating is delicately titled "iWhatever." In it, Plating acknowledges the incessant iPhone hype and assures employees that AT&T can't match Verizon's service, choice of phones and extras, like music.
"A day hasn't gone by over the past two weeks when someone hasn't asked me what our response to the iPhone is. I meet each of these opportunities with enthusiasm, because the iPhone is simply a response to you and what Verizon Wireless has achieved," he writes.
In his pep talk, he calls AT&T's iPhone "yet another attempt to stay competitive with us," but adds that he's "not ready to dismiss the impact of the iPhone" on Verizon. Moreover, he says, AT&T's data network speeds are half those of Verizon's. And because "a device is only as good as the network it's on," he says, "I expect AT&T to ignore the network service completely and focus squarely on the device."
Verizon is not letting AT&T completely dominate the day today--the company said its stores will be open until 9 tonight with special deals on some devices.
source : com.com
I've been a fan of SimplyMEPIS for years. The distribution was one of the early pioneers in the field of user-friendly Linux development, and to this day offers a system that usually "just works." Earlier this month the MEPIS site announced a community variation for older computers based on SimplyMEPIS. AntiX is an installable live CD that features a modern kernel, recent X server, and lighter applications for use on computers with as little as 64MB RAM. I tried it, and liked what I found.
I tested AntiX on my everyday laptop (a Hewlett-Packard dv6105 with a 2.0GHz AMD Turion and 512MB RAM) as well as an older 667MHz Pentium III computer. I was immediately disappointed that a distribution advertised for "antique" computers doesn't ship with (or provide on mirrors) a diskette boot image. Many computers of the target era can't boot from the CD-ROM drive, and overlooking that requirement is a major flaw in the distro's design.
Despite starting out on the wrong foot, I booted AntiX on my notebook and found myself at a login screen with no clue as to the user and password I needed to log in. Searching for them was a tedious task. AntiX doesn't have its own Web site yet, and all information about it is piecemeal around the SimplyMEPIS site and forum. I found what was reported to be the user and password, but it didn't work. Finally, I was able to use "root" for the user and password and start the tastefully executed Fluxbox window manager.
As with SimplyMEPIS, hardware detection and setup for my laptop was nearly impeccable. The X resolution was optimal, my touchpad and USB mouse functioned smoothly and accurately, and sound worked automagically. Most remarkable was finding my Internet connection available at login. SimplyMEPIS and AntiX are the only two distros that have offered such a convenience for my Ndiswrapper-dependent Ethernet chip.
Under AntiX, removable media is detected by the kernel, but I had to mount the partitions manually. Printer setup is handled through the GNOME CUPS manager. Advanced powersaving features, such as CPUfreq modes and suspend/standby, are available and functional at the command line (with powersave), though there are no corresponding graphical applications. You can easily add battery and CPU temperature monitoring to the desktop application Conky.
The desktop is an uncluttered pleasure. It features only Conky in the upper right corner and a panel at the top. In the menus you'll find ample applications for completing most common tasks, including Abiword and Leafpad for word processing and text editing, the GIMP for image manipulation, Gnumeric for spreadsheets, Firefox, Dillo, and Sylpheed Claws for Internet and communications, Audacity and XMMS for audio enjoyment, Camstream for webcam use, Xine for video playback, X-CD-Roast and Graveman for CD-ROM creation, and much more. System tools include the Synaptic package manager, QTParted, and ROX filer. Configuration utilities include pppconfig, sysvconfig, and a DSL/PPPoE configuration tool. There are a few games as well, such as Xscorch, GnuChess, and Xmahjongg. Under the hood we find Linux 2.6.15-27, Xorg 7.1.1, and GCC 4.0.3. The MEPIS hard drive installer is not listed in the menu, but can be invoked from the command line (at /usr/sbin/minstall).
Performance on my laptop was excellent, but I was curious how AntiX would perform on an older computer. To find out I tracked down the oldest working machine in the family -- a 667MHz Pentium III on an Intel i815 motherboard with 256MB RAM.
Again, hardware detection and auto setup were good. Graphics in that machine are driven by an Nvidia 5900 with an older 17-inch monitor. X started with a resolution of 1024x768. I would have preferred 1280x1024 by default, but I could set that easily through a manual configuration file edit. The generic wheel mouse worked as it should, and the old SoundBlaster PCI 512 card was functional. The system saw my ancient Hewlett-Packard scanner but didn't automatically configure it. It also detected my Ethernet card and loaded the correct module.
Menu and window operations were immediate and responsive. Most applications opened in what seemed like an average timeframe, with a notable exception being Firefox. Subsequent application starts were much faster for some applications, such as Firefox and Graveman. The system performed well during operation; I could barely tell I was working on an older machine. As you can see in the chart below, performance was well above acceptable for this older computer.
AntiX Performance Comparison in Seconds
(Entries like 21/9 show first launch / subsequent launch times.)

Task or App | Pentium III | AMD Turion64
---|---|---
Boot | 66 | 48
Abiword | 6 | 6
The GIMP | 9 | 3
Firefox | 21/9 | 12/2
Gnumeric | 6 | 4
Sylpheed Claws | 6 | 3
Audacity | 12 | 4
Graveman | 19/4 | 19/1
Xine | 5 | 5
Xfv file manager | 4 | 3
Shutdown | 22 | 16
I was impressed by, and enamored of, AntiX. Its understated, attractive theme is excellent window dressing for the superior performance hidden within. Since it is based on SimplyMEPIS, hardware detection and setup are very good as well. The distro includes the extras necessary for a complete user experience, such as browser plugins and multimedia codecs. I had no trouble with stability or missing functionality, except that QuickTime movie trailers would not play. I found in testing that performance on an older computer wasn't significantly worse than what I experienced on a modern machine.
However, AntiX's lack of a boot diskette image makes the software unusable for many in its target audience. In addition, the lack of graphical tools for the power-saving functions might inconvenience some users. Another problem is the lack of a centralized Web site or forum for information and help. Hopefully these issues will be addressed in the future.

All in all, AntiX gets a thumbs-up for its progress so far. I hope to see more from the project in the future. AntiX offers a modern and pleasing alternative for keeping older computers productive.
source : linux.com
Qwest proposed Wednesday to create a program that would subsidize high-speed Internet deployment to underserved rural areas that are not cost-effective for companies to service, reports the Denver Post.
The program would be managed by state public utilities commissions and funded by the Universal Service Fund. The fund receives about $4 billion annually from a surcharge on every U.S. customer's bill. About $1 billion of that is funneled to wireless carriers that provide service to rural customers, said Steve Davis, Qwest's senior vice president of public policy.
Qwest’s 14-state local-phone service territory includes vast rural areas. The company wants to change the wireless phone subsidy from per-user to per-household, which would cut the amount given to wireless carriers by about $500 million annually. That money would be used for Qwest’s broadband deployment program.
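Laid out explicitly, the money flows described above look like this; the figures are the article's, and the sketch is only a bookkeeping illustration:

# Annual flows described in the article, in US$ billions.
usf_total = 4.0        # collected from the surcharge on customers' bills
wireless_today = 1.0   # funneled to wireless carriers serving rural customers
estimated_cut = 0.5    # effect of moving from per-user to per-household

wireless_under_plan = wireless_today - estimated_cut   # $0.5B
freed_for_broadband = estimated_cut                    # funds Qwest's program

print(f"Wireless share of the ${usf_total:.0f}B fund today: ${wireless_today:.1f}B")
print(f"Under the proposal: ${wireless_under_plan:.1f}B, "
      f"with ${freed_for_broadband:.1f}B redirected to rural broadband")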
Unlike AT&T, Verizon, Sprint or Clearwire, Qwest does not own cellular frequencies or WiMAX frequencies at 2.5 GHz.
Strategies differ on how to deliver broadband everywhere, but all the players would like free money from Uncle Sugar — or more precisely, a piece of the USF billions.
Currently, most USF funding collected from ratepayers goes to prop up rural twisted pair infrastructure or subsidize duplicative wireless cellular carriers.
Qwest has submitted its proposal to the FCC. The program would fall within the purview of the FCC and wouldn't require new legislation. According to the Denver Post, the FCC would have to approve the proposal by the end of the year for the program to launch by Qwest's goal of fall 2008.
Currently, universal service funds are allocated by state PUCs to eligible telecommunications carriers. Qwest's proposal would instead allocate USF subsidies to the lowest bidder in an underserved region. That has the potential to reduce the inefficiencies that plague the current USF system, say proponents.
Under the Qwest program, state utilities commissions would identify regions that need broadband deployment. Companies would submit bids for one-time subsidies to provide the service to residents in those areas.
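A minimal sketch of that lowest-bid allocation, with hypothetical regions, bidders and amounts (the proposal's actual bidding rules are not spelled out in the article):

# Hypothetical bids for one-time deployment subsidies, in US$ millions.
bids = {
    "Region A": {"Qwest": 12.0, "WISP One": 9.5, "Rural Co-op": 11.0},
    "Region B": {"Qwest": 7.0, "WISP Two": 8.2},
}

def award_subsidies(bids):
    """Award each underserved region's subsidy to the lowest bidder."""
    return {region: min(offers.items(), key=lambda item: item[1])
            for region, offers in bids.items()}

for region, (winner, amount) in award_subsidies(bids).items():
    print(f"{region}: {winner} wins at ${amount:.1f}M")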
The USDA’s $1.2 billion Rural Utilities Service program, which is tasked with funding rural broadband deployment, was attacked this May by Congress for not doing anything of the sort. The Washington Post reported that since 2001 more than half the money has gone to metropolitan regions or communities within easy commutes of a mid-size city.
Members of the House committee said the five-year, $1.2 billion rural broadband loan program was broken. It missed many unserved areas while channeling hundreds of millions of dollars in subsidized loans to companies in places where service already exists, the committee charged.
“If you don’t fix this, I guarantee you this committee will,” House Agriculture Committee Chairman Collin C. Peterson (D-Minn.) told James M. Andrew, administrator of the Rural Utilities Service this May. “I don’t know why it should be this hard.”
A study found more than half the funds instead went to urban broadband deployment, and just one out of sixty-nine loans went to wiring a region without any broadband service whatsoever.
The USDA has responded by issuing a new set of proposed rules aimed at making sure the fund is doing what it was originally designed for. The USDA is also pushing to have the program extended until 2012 as part of the 2007 Farm Bill.
Three planned statewide broadband wireless networks in the United States are in South Carolina (31,000 square miles), Vermont (9,249 square miles), and Rhode Island (1,044 square miles).
ConnectKentucky is being hailed in Congress as one model for federal high speed internet policy. Currently, 93 percent of Kentucky homes can access broadband, and ConnectKentucky expects every household to be capable of using high-speed Internet by the end of the year. Rep. Rick Boucher (D-Va.) cited the program’s success when he outlined a national plan for universal high speed internet access.
China Telecom to launch AVS-based IPTV trial
By Cai Yan
Courtesy of EE Times (05/25/2007 7:08 AM EDT)

China Telecom's trials will begin next month in Shanghai, the largest IPTV market in China, using the domestically developed Audio Video Coding Standard (AVS). It's unknown how big the trial will be, but if China Telecom decides to standardize its IPTV platform on AVS, it would be a blow to suppliers of H.264 gear.

China Telecom denied the news. However, Huawei Kong, vice director of the Institute of Computing Technology, a leading state-run research group, confirmed the trials will happen. In addition, Leping Wei, chief engineer of China Telecom, indicated last month that the carrier was considering AVS after seeing positive results from tests in Dalian by China Netcom, its rival. China Netcom plans to use AVS-based IPTV in 20 cities by the end of 2007, and hopes for 6 million AVS-based IPTV users in five to seven years, or 40 percent of its current broadband users.

In a market dominated by H.264, some observers believe it doesn't make much sense to switch to AVS, which may carry higher costs because of its immaturity. But there is a sense that political pressure rather than business sense is driving the decision: all of China's telcos are state controlled, and AVS is among a handful of domestic standards that China is promoting in order to lessen its reliance on foreign intellectual property. If the strategy is successful in the long run, it will shift the flow of royalties and fees into the coffers of local rather than foreign companies and help build up domestic technology.

In the first quarter of this year, IPTV users in Shanghai increased to 220,000, according to a report from Analysys International; the researcher said the city's IPTV use shot up 150 percent in the quarter. IPTV users in China overall reached 612,000 in the first quarter, up 36 percent from the quarter before. Market researcher iSuppli Corp. estimates that by 2008 there will be 3.6 million IPTV users in China, and by 2010 the figure will reach 17.4 million. But those targets may be optimistic.

source : digitaltvdesignline.com
Last week at NXTcomm, AT&T announced the launch of AT&T Video Share, a new service enabling users to share live video over mobile devices while participating in voice calls. According to AT&T, Video Share enables one-way, live streaming video feeds, viewable by both participants in a two-way voice conversation. While touting the service as a means for subscribers to share video from events like weddings and related celebrations, the operator also outlined a series of potential enterprise applications, such as viewing real estate and facilitating insurance claims.
The first service delivered via AT&T's next-gen IMS network platform, Video Share is now available in Atlanta, Dallas and San Antonio--in late July, the service will expand across the carrier's 3G footprint. AT&T will offer Video Share at prices of $4.99 per month for 25 minutes of usage or $9.99 per month for an hour of usage. Subscribers may also select a pay-as-you-go option at 35 cents per minute.
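The plan arithmetic is simple enough to sketch. The comparison below assumes usage stays within each plan's bucket and ignores overage charges, which the article doesn't give:

def cheapest_plan(minutes_per_month):
    """Compare the AT&T Video Share pricing options from the article.
    Assumes usage stays within each plan's bucket; overage rates unknown."""
    options = {}
    if minutes_per_month <= 25:
        options["$4.99 / 25 min plan"] = 4.99
    if minutes_per_month <= 60:
        options["$9.99 / 60 min plan"] = 9.99
    options["pay-as-you-go @ $0.35/min"] = 0.35 * minutes_per_month
    return min(options.items(), key=lambda kv: kv[1])

for m in (10, 20, 40):
    plan, cost = cheapest_plan(m)
    print(f"{m:>2} min/month -> {plan} (${cost:.2f})")

By this rough math, pay-as-you-go wins only for light users (below roughly 14 minutes a month); beyond that, the flat plans are cheaper.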
"Video Share is groundbreaking today as a wireless-to-wireless service, but the potential for the service will expand even further in the near future," said AT&T chairman and CEO Randall Stephenson during his NXTcomm keynote. "Imagine watching television when a notice pops on the screen that a daughter or granddaughter would like to initiate a Video Share call, then immediately switching the television screen to accept the video and audio. With our powerful IP-based network and flexible IMS platform, these scenarios will eventually be reality."
Stephenson also noted that the service would make its way to the PC screen and television screen very soon. Mobile-to-U-Verse live video feeds? Maybe soon enough.
source : fierceiptv.com
Verizon celebrated the 1 millionth subscriber to its FiOS Internet service last week as well as the 500,000th subscriber to its FiOS TV service. The half-million subscriber mark comes only 20 months after Verizon launched the fiber-based TV service in September 2005. Verizon said that both milestones were achieved during the second quarter of this year.
The 1 millionth subscriber to FiOS Internet, Marjorie Bayer of Massapequa, NY, also has FiOS TV: "You can really see the difference in picture quality from what we had before, on regular channels and especially on high-def," Bayer said. "What we really like is the multi-room DVR, because the kids like to watch shows in their rooms that we've recorded downstairs. Overall, when we came back to Verizon we lost nothing and gained a lot."
source : fierceiptv.com
21 Feb 2007
The OPERA specification is the global solution for Access BPL
Madrid, 21 February 2007 - OPERA, the Open PLC European Research Alliance for a new generation of Powerline Communications integrated network technology, today announced that the European Commission has approved financing of 9.06 million euros to support extended field deployments based on an open specification for access Broadband over Powerline (BPL) applications.
The specification adopted in OPERA Phase 1 is publicly available on the project web site (www.ist-opera.org).
Over the next two years (2007-2008), the objective of OPERA Phase II will be to catalyze the deployment of low-cost broadband access applications over electricity networks for a wide range of applications and use cases.
The OPERA specification for BPL access applications, adopted in 2006, is based on DS2's 200 Mbps technology and was developed by a consortium of 37 companies that included silicon vendors, equipment vendors, electric utilities, telecom operators and universities from 10 countries. This specification, the only open, global specification for access BPL, generated valuable contributions that have been submitted to the powerline standardisation work underway in both the Institute of Electrical and Electronics Engineers (IEEE) and the European Telecommunications Standards Institute (ETSI). The OPERA specification has also been endorsed by the Universal Powerline Association (UPA), the leading Powerline Communications industry body driving the acceptance of open standards for access BPL and home networking applications over Powerline.
OPERA Phase II will contribute field research to both IEEE and ETSI confirming the importance of the work undertaken in developing and validating the specification for the deployment of access BPL.
The Consortium supporting OPERA Phase II is made up of 26 partners (including 8 universities and research centers) from 11 different countries. The consortium is led by the Spanish utility Iberdrola.
source : ipcf.org
The “third wire” is coming. Some in the industry are calling BPL (Broadband over Power Line) the third wire, and PC World has the scoop.
“We are at an inflection point in the industry,” agreed Ralph Vogel, spokesman for Utility.net, a Los Angeles-based BPL integrator. “Its position is similar to that of DSL in the late 1990s: people have heard of it, and while we were previously not quite there yet with the technology, we are now.”
In fact, one technology consultant in Cincinnati had this to say about his BPL internet connection: “It seems equivalent to standard cable service and a little faster than standard DSL,” he noted. “But the speed is not asynchronous, meaning you get the same speed upstream and downstream. I can’t get the same bandwidth for any price close to it from another carrier.”
His broadband over power line connection runs at roughly 3Mbit/sec. Considering that this speed applies both upstream and downstream, BPL offers a considerable advantage over cable internet connections, which cap the upstream speed significantly.
Add the flexibility of having the internet connection available through any power outlet in the house, and you have a no-brainer for internet-hungry consumers. Current estimates project that today's base of 150,000 customers will exceed 2.5 million in just four years, according to Chris Rodin, analyst at Parks Associates in Dallas, Texas. Barhorst, the Cincinnati consultant, receives his internet connection from the utility Duke Energy; the internet service itself is handled by Current Communications of Germantown, Maryland.
source : broadbandfocus.com
Canonical announces details of Ubuntu for Mobile Internet Devices
TAIPEI, Taiwan, June 7, 2007 – Canonical Ltd., the commercial sponsor of Ubuntu, announced more details on Ubuntu Mobile and Embedded Edition at Computex 2007 in Taipei. Following discussions at the Ubuntu Developer Summit in Seville, Spain, and a great response from the developer community generally, the target specifications and technical milestones for the project have been agreed.
source : ubuntu.com
The Nielsen Company, the longtime monitor of television consumption, is buying Telephia, a private company based in San Francisco that collects data on the cellular market, reports the NY Times.
Telephia tracks consumers’ phone calling, mobile Web surfing, video viewing and other data for advertisers. Nielsen has been building mobile tracking products on its own, but Telephia will greatly advance its ability to track media consumption on every screen, Nielsen executives said.
Telephia’s consumer research is based on surveys of different people each time, in contrast to Nielsen’s practice of monitoring what the same people watch over time, says The Times. But Telephia also holds some patents for media-consumption tracking on cellphones that is similar to Nielsen’s approach in television.
Participation TV is one of the hottest areas of the mobile industry, says RCR News. According to Telephia, Americans texted nearly 35 million times in the first quarter of this year, generating roughly $35 million in revenue. Game shows and reality programs like “Deal or No Deal,” “Wheel of Fortune,” “Dancing with the Stars,” and “Hell’s Kitchen” utilize SMS for audience participation — and increased revenue.
Aggregators like SinglePoint deliver tens of thousands of votes/entries per second by telephone, IVR and SMS.
Seattle-based M:Metrics also measures mobile phone usage through surveys and by installing monitoring software on users’ mobile phones.
InfoSpace says it’s hard to get big companies interested in advertising in the mobile space when so much is unknown. But analysts expect the global mobile content and applications market to exceed $80B by 2010, and more than 3 billion mobile subscribers are projected by 2011. Of those, approximately 74% will be mobile data subscribers, contributing 20% of revenue for operators.
Leichtman Research Group, Nielsen/NetRatings, Point Topic and IT Facts have additional statistics.
source : dailywireless.org
Startup SiBEAM today revealed its new WirelessHD chipset, designed to “make wireless multi-gigabit throughput a reality” in the home.
SiBEAM’s chipset operates in the 60 GHz, or ‘millimeter-wave,’ unlicensed band, with 7 GHz of frequency bandwidth. Technology competitor UWB operates from 1.5 to 7.5 GHz, while 802.11n uses one or two 20 MHz-wide channels (generally in the 5 GHz band). SiBEAM uses adaptive beam-steering technology that takes advantage of the directional nature of the 60 GHz signal.
“Wireless USB,” using UltraWideband and the WiMedia spec, can deliver 480 Mbps at 10 feet or so, but HDMI and DVI cables deliver data rates in the gigabit range. That’s where WirelessHD comes in.
SiBEAM says the main advantage of unlicensed 60 GHz is that compression chips are not needed for HDTV, since the band can handle the gigabit speeds required by baseband (uncompressed) HDTV video.
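That gigabit claim is easy to sanity-check with a back-of-the-envelope calculation, assuming 1080p video at 60 frames per second and 24 bits per pixel (actual formats and blanking overhead vary):

# Uncompressed (baseband) HDTV bit rate, rough estimate.
width, height = 1920, 1080   # 1080p frame
fps = 60                     # frames per second (assumed)
bits_per_pixel = 24          # 8 bits each for R, G, B (assumed)

bitrate = width * height * fps * bits_per_pixel
print(f"~{bitrate / 1e9:.2f} Gbps uncompressed")   # ~2.99 Gbps

# Well beyond Wireless USB's 480 Mbps, which is why SiBEAM targets
# the multi-gigabit capacity of the 7 GHz-wide 60 GHz band.
print(f"vs Wireless USB: {bitrate / 480e6:.1f}x the 480 Mbps rate")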
source : dailywireless.org
June 26, 2007
Netgear is working with femtocell maker Ubiquisys on a 3G femtocell with a Wi-Fi access point for resale by mobile carriers next year.
The unnamed product will do everything a DSL modem/Wi-Fi router combo, typically called a residential gateway, does today, and will also add femtocell service to extend a carrier’s signal into the home. In this case, the device will support UMTS and HSPA (including HSDPA and HSUPA) for 3G connections.
Ubiquisys makes the ZoneGate femtocell, which uses DSL for the backhaul connection to the carrier; that technology will be integrated in the Netgear product.
VoIP is also one of the new gateway’s key features. Jay Kim, product line manager for femtocell products at Netgear, says this unit will support both UMA and SIP/IMS.
Despite the arrival dates for this product (sampling this year, with commercial availability by early 2008), Kim says it will initially only support standard 802.11g, not 802.11n -- this despite the fact that the Wi-Fi Alliance this week started actual testing of 11n products for interoperability, months before the 11n standard is ratified. However, that’s not stopping vendors – including Netgear – from pumping out 802.11n products today based on the 2.0 draft of 11n.
Femtocells and UMA are both rivals and compatriots. The two technologies compete in that both are used to extend a mobile carrier’s reach into the home, the former with standard cellular signals, the latter using cellular-to-Wi-Fi hand-off. But UMA developer Kineto Wireless – also a Ubiquisys partner – is also working on using UMA as a backhaul solution for femtocells. As the UMAtoday.com blog states, "UMA... is a generic IP access technology that can be used to implement a dual-mode handset (DMH) service with cellular/Wi-Fi phones, but it’s not actually tied specifically to Wi-Fi" -- and UMA "plays a key role in a mass-market femtocell solution."
ABI Research recently said that FMC has an early lead, but that femtocells will skyrocket after 2010; it also ranked Ubiquisys the number one femtocell vendor last week, ahead of ip.access and RadioFrame Networks, based on criteria like “innovation” and “implementation.”
source : wi-fiplanet.com