All urchins, all the time

Written by admin on 02/01/2019

Science has placed a large number of resources regarding the urchin genome, including some free content and links to its subscriber-only articles, on a single page. If you wish to follow up on any of this information, that's the place to start. In the meantime, we can dive right in.

As I mentioned earlier, echinoderms such as the sea urchin are among the most distant members of the deuterostomes, a group that includes all vertebrates. How distant? The deuterostome common ancestor dates from the pre-Cambrian, and echinoderms with tube feet and a water-based vascular system appear in the early Cambrian, well over 500 million years ago. Echinoderms we can recognize as having modern features became the dominant group following the great Permian-Triassic extinction 250 million years ago. So the genome, in some ways, provides a glimpse into the distant past, as it reveals what the ancestor of all vertebrates had in its genome half a billion years ago.

Read on for the full story.

Obtaining the genome itself took advantage of a few interesting new ideas in genome sequencing. Much of the sequence was generated with what's called a "whole genome shotgun," a technique that frequently leaves questions regarding the order of sequences and gaps in the final product. In this case, these problems were corrected by the use of large fragments of the genome cloned into bacterial artificial chromosomes (BACs).

Normally, sequencing BACs is laborious, but the authors developed a technique that sequenced pools of BACs simultaneously, getting the process done in one-fifth the normal time and at 10 percent of the usual cost. By arranging the BACs on a grid, they could pool DNA from the columns and rows and sequence it in batches; the source of specific sequences could be identified from the intersection of the pools that shared the same sequences. The information generated by BAC sequencing was then integrated with the whole genome shotgun sequence by computer. The computer programs that assemble genome sequences had to be altered because the sea urchin population appears to be very diverse, with about one base difference per 50 bases, a level that the assembly programs interpreted as sequencing errors. The authors suggest the diversity results from the urchin's reproductive strategy: they simply release eggs and sperm into the currents, limiting the probability of forming local, inbred populations.
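As a back-of-the-envelope illustration (the names and the tiny grid are entirely made up; this is just the intersection logic, not the actual pipeline), the row/column trick might look like this in Python:

```python
# Hypothetical sketch of the pooling idea: BACs sit on a grid, DNA from each
# row and each column is pooled and sequenced, and a sequence seen in exactly
# one row pool and one column pool is assigned to the BAC at the intersection.

def locate_bacs(row_pools, col_pools):
    """Map each sequence to the (row, col) grid position of its source BAC.

    row_pools / col_pools: dicts mapping pool index -> set of sequences
    observed in that pool.
    """
    locations = {}
    for seq in set().union(*row_pools.values()):
        rows = [r for r, seqs in row_pools.items() if seq in seqs]
        cols = [c for c, seqs in col_pools.items() if seq in seqs]
        # An unambiguous assignment needs exactly one row and one column hit.
        if len(rows) == 1 and len(cols) == 1:
            locations[seq] = (rows[0], cols[0])
    return locations

# A 2x2 grid: the BAC at (0, 1) contributes "ACGT" to row pool 0 and
# column pool 1, and so on.
rows = {0: {"ACGT", "TTAA"}, 1: {"GGCC"}}
cols = {0: {"TTAA"}, 1: {"ACGT", "GGCC"}}
print(locate_bacs(rows, cols))
```

Any sequence that shows up in more than one row or column pool is left unassigned here; in practice, resolving that kind of ambiguity is what the follow-up assembly work had to do.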

Identifying the genes began before the BAC sequencing was complete. Four separate programs scanned the sequence for features typical of eukaryotic genes and similarity to genes in other organisms. Predicted genes were fed into a database that was accessible to a number of experts in sea urchin biology; they went through and evaluated and annotated the predictions, sharing their work on a listserv set up for the purpose. In the end, the sequence yielded a genome of 814 megabases, carrying over 23,000 genes. The structure of many of these genes was refined by the use of a whole genome tiling array, which also revealed that nearly half of them are expressed during the first few stages of embryogenesis.
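Purely as an illustration of the kind of reconciliation involved (the real annotation effort was far more sophisticated, and hand-curated by those experts), combining the output of multiple gene-prediction programs by a simple vote might look like:

```python
# Toy consensus step: each prediction program emits (start, end) spans for
# candidate genes; we keep spans supported by at least two programs. This is
# an invented example, not the project's actual method.

from collections import Counter

def consensus_predictions(program_outputs, min_support=2):
    """program_outputs: list of lists of (start, end) predicted gene spans."""
    counts = Counter()
    for predictions in program_outputs:
        counts.update(set(predictions))  # each program votes once per span
    return sorted(span for span, n in counts.items() if n >= min_support)

programs = [
    [(100, 500), (900, 1400)],                 # predictor A
    [(100, 500), (2000, 2600)],                # predictor B
    [(100, 500), (900, 1400), (3000, 3100)],   # predictor C
]
print(consensus_predictions(programs))  # [(100, 500), (900, 1400)]
```

(100, 500) is supported by all three programs and (900, 1400) by two, so both survive; the singleton predictions drop out.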

Well over 7,000 of the genes are shared with vertebrates, although a number that appear to be unique to the echinoderms were found. Many of these unique genes are involved in the production of the sea urchin's skeleton, which is composed of calcite, unlike the vertebrate skeleton. As none of the other deuterostomes have skeletons, it appears that the formation of mineralized tissue has evolved twice within the lineage.

Many classes of related proteins are encoded by multiple genes in vertebrates, in part because that lineage appears to have undergone two whole-genome duplications. Echinoderms seem to have kept a more ancestral genome (no duplications), and many of these proteins are present only as a single copy. In some cases, however, urchins have made up for this with smaller duplications of individual genes—one such case is the small GTPases (including Ras and RABs), which mediate many signaling processes and direct the motion of vesicles around the cell.

There are a few types of genes that sea urchins appear to lack entirely. These include part of the cell's skeleton called the intermediate filaments, as well as the integrins and cadherins that link these filaments to the cell's surface. They also lack the proteins that are needed to form gap junctions, the links between cells that allow electric currents to pass between them—this appears to limit the ways in which the nerves of urchins can propagate signals.

The nervous system in general had a few other surprises. It's been known that each appendage in sea urchins has a local sensory-motor loop which manages its activities; overall coordination is handled by radial nerve bundles. There are no obvious sensory organs. But the genome reveals a huge number of receptors similar to the ones that mammals use to sense odors, balance, and sound, and there are six different light-sensing proteins. Expression analysis shows that many of the sensory genes are expressed in the urchin's tube feet, suggesting a previously unrecognized sensory sophistication.

Some of the biggest differences are apparent in the immune system. The basic building blocks of the deuterostome immune system appear to be ancient: nearly all of the transcription factors and signaling molecules used by vertebrates are present in the urchin genome, including key signaling systems like the Interleukins and Tumor Necrosis Factor. But most of the urchin immune response appears to be based on what's called "innate immunity." Innate immunity typically uses a limited number of receptors that recognize a large number of pathogens. In the sea urchin, that limited number has been expanded to take up nearly three percent of its genes.

Sea urchins, like everything other than jawed vertebrates, don't appear to have the "adaptive immunity" arm of the immune system, which relies on antibodies and the T-cell receptor. Intriguingly, however, the raw material for antibodies may be present. The genes that catalyze the DNA rearrangements that produce unique antibodies, termed RAGs, are present in the urchin genome. Stretches of DNA resembling the raw materials of the variable regions in antibodies are there, too, although they lack the sequences that the RAGs need to generate mature antibody genes. These findings suggest that the production of antibodies isn't much of a vertebrate innovation, as was initially thought. The authors even raise the possibility that the RAGs are mediating the production of some other variable immune molecule—we just don't know how to recognize it yet.

It's hard to know how to wrap up a story like this, because the work is so far-reaching. It has answered some outstanding questions, removed others that have lingered over the biology, and provided some answers about the genetic raw material that our ancestors had to work with over half a billion years ago. One other thing worth noting is that the money for this effort was almost certainly justified by suggesting we'd get exactly these sorts of answers. To me, at least, it appears that we taxpayers have had our money well spent in this case.


CCP and…White Wolf? Okay, I didn’t see that one coming

Written by admin

CCP is a great company: from their customer service to their ability to stay focused on their one product (EVE Online) and really serve the community, I've always been impressed with them. Of course, they're a small company with a fanatical online following, so there isn't a lot of news about them that isn't EVE based. Until now. Did I really think I would wake up and find out that they merged with White Wolf? If I put on my analyst hat and sat down for a month or so to see what quirky mergers I could think of, that probably never would have made it on the table. I'm also going to have to admit in this post that in high school I used to play a lot of World of Darkness pen and paper games with my friends. There, my secret is out. I'm getting geekier by the day. I'm also excited about what this means:

CCP is bringing a range of White Wolf's role-playing properties online, while the table-top publisher will develop card games, RPG systems, novels and more based on the EVE Online universe.

The EVE universe is ripe for this sort of thing, but CCP's access to the World of Darkness gets me all sweaty. The idea of online games with White Wolf's take on Vampires, Werewolves, Changelings, and all the other sundry monsters and backstories from the World of Darkness is enough to get any fan going. CCP's dedication to the online gaming world, as well as their attention to detail, makes them a good match to see if any of this IP will work in the gaming world outside of RPG titles like Vampire: The Masquerade. I can't wait to see what these two companies come up with.

White Wolf and CCP…I just can't get over it. This makes me want to get out my old source books, order a large pizza or two, grab a bag of dice, and spend a weekend giving those Sabbat punks a what-for.


Online education continues to grow in higher-ed

Written by admin

According to a new report on the state of online learning in U.S. higher education, more students than ever before have been taking online courses. The full report, entitled "Making the Grade: Online Education in the United States 2006," estimates that nearly 3.2 million students took at least one online course during the fall 2005 term, an increase of nearly a million people from the previous year.

Nearly three-quarters of those hopping online to learn are undergraduate students looking to complete either Bachelor’s or Associate’s degrees, while most of the rest are doing work towards a Master’s designation. Among undergrads, the majority of students are in Associate’s programs.

More than 96 percent of schools with more than 15,000 students offer some form of online courses. About two-thirds of the very largest organizations offer complete programs online, which purport to allow students to complete nearly all of their degree work remotely. These figures, which have also increased from 2004, show that online education has definitely entered the mainstream as far as higher education is concerned. The overall percentage of schools that identified online education as a critical long-term strategy grew from 49 percent in 2003 to 56 percent in 2005.

Not all the news about online education is positive. Educators still have some concerns about the extra discipline required from online students compared to their in-class counterparts. In general, the report says, teachers believe that it takes more effort to teach a class online than face-to-face. However, the consensus among educators was that evaluation was no more difficult in online courses.

Just as important, the report says that college and university education leaders by and large believe that online education is as good as traditional face-to-face education, with nearly 17 percent saying that it’s actually better. Of course, those same leaders are in charge of developing and ultimately marketing their own online programs, which undoubtedly leads some of them to be bullish in their assessments.

Whatever the case, online education is indeed growing rapidly, but the overwhelming majority of students who use it are supplementing traditional face-to-face education, not replacing it. Will that change? According to the report, educators generally feel that it is students themselves who are holding back online education, with nearly two-thirds suggesting that the biggest challenge facing students is the discipline needed to complete an online course.

After all, at least with traditional face-to-face education, regular course meetings help to keep students on track, even if they show up to class only to hop on the Wi-Fi and surf the day away.

The study (PDF) was supported by the Alfred P. Sloan Foundation and conducted by the Babson Survey Research Group in partnership with the College Board.

Ken Fisher contributed to this report.


NVIDIA rethinks the GPU with the new GeForce 8800

Written by admin on 08/07/2019

Today, NVIDIA launched their new graphics processor, the GeForce 8800GTX. The 8800 is getting rave reviews (see the links below), and on many benchmarks it even outperforms two ATI graphics cards in a Crossfire rig. In fact, the 8800GTX (575MHz core clock, 768MB of GDDR3 at 900MHz) spanks the competition handily in spite of the fact that drivers for the GPU are in a pretty green state at the moment. One look at the benchmarks will show you that the 8800 represents a major advance in graphics performance, so you’re probably wondering how NVIDIA pulled it off. In short, the 8800 derives its performance gains from a brand new architecture that’s different in some fundamental ways from previous GPU architectures.

The G80 architecture that powers NVIDIA’s new GeForce 8800 graphics cards has been four years in development, and its new-from-the-ground-up design marks a significant departure from previous PC GPUs. The G80 is the first in a coming wave of DirectX 10-compatible cards that offer a unified shader model of the kind that has previously only been available to gamers in the form of the Xbox 360’s ATI-designed Xenos GPU.

I’m still poring over NVIDIA’s presentation materials and trying to get a good handle on exactly what the company has done with the G80, so I’ll only briefly review the major advances of the design from a “big picture” perspective.

Some time ago, NVIDIA hosted a conference call in response to a pair of announcements from ATI and Peakstream in the area of stream processing. The company was coy about what they were up to, and all they’d say was that their project wasn’t like what either Peakstream or AMD/ATI is now doing. The Peakstream and AMD/ATI approaches were, in the words of the NVIDIA spokesperson, “like putting lipstick on a pig”—they were taking the existing GPU architecture and bolting on a software layer of abstraction that’s intended to hide its graphics-specific nature while exposing its stream processing functionality.

NVIDIA countered that they had an announcement in the pipeline about a new type of product that’s different from both a hardware and a software perspective, but they wouldn’t say much more than that.

The G80 as a stream processor

The 8800 is clearly the product they were talking about; it’s actually built from the ground up as a highly multithreaded, general-purpose stream processor, with the GPU functionality layered over it in software. This is the reverse of existing general-purpose GPU (GPGPU) approaches. So with the G80, a programmer can write a stream program in a regular high-level language (HLL) that compiles directly to the stream processor, without the additional overhead that goes along with translating HLL programs into a graphics-specific language like OpenGL’s GLSL.

(If you’re not sure what a “stream processor” is, or if what I just said confuses you, be sure to stop right now and go read this article for background on the concepts I’m covering here.)

Ideally, a program for the G80 would consist of hundreds of stream processing threads running simultaneously on the GPU’s many arrays of tiny, scalar stream processors. These threads could do anything from graphics and physics calculations to medical imaging or data visualization.
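If the idea of a stream program is still fuzzy, here's a deliberately tiny sketch in Python of the programming model being described: one small kernel applied independently to every element of a stream. This illustrates the concept only, not NVIDIA's actual toolchain, and the kernel itself is made up.

```python
# Toy model of stream programming: a small "kernel" function is applied
# independently to each element of an input stream. On the G80, each element
# would be handled by its own lightweight thread on a scalar stream
# processor; here we simply map the kernel over the data.

def saxpy_kernel(x, a=2.0, b=1.0):
    # a per-element multiply-add, the bread and butter of stream workloads
    return a * x + b

def run_stream(kernel, stream):
    # stand-in for launching one thread per stream element
    return [kernel(x) for x in stream]

print(run_stream(saxpy_kernel, [0.0, 1.0, 2.0]))  # [1.0, 3.0, 5.0]
```

The key property is that no element's result depends on any other's, which is exactly what lets the hardware fan the work out across hundreds of threads.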

Making a fully generalizable stream processor like the G80 required NVIDIA to include a feature that graphics programmers have desired for as long as they’ve wanted a unified shader model: a hardware virtual memory implementation for the GPU that enables seamless access to main memory for GPU-based programs.

Virtual register files and virtual memory

From what I can tell, the G80 has a multilevel parallel data cache (L1 and L2 caches, shared among stream processor clusters) that pages data in and out from main memory, just like a regular processor’s virtual memory implementation. However, the L1 cache level isn’t like a typical CPU’s L1 cache. In fact, it may be more like the local store of Cell’s SPUs. Here’s how it seems to work.

At a high level, the G80’s execution hardware has eight of what I’ll call “tiles,” for lack of a handy, NVIDIA-supplied word. Each tile consists of two groups of eight stream processors, for a total of 16 stream processors per tile. Each tile also has some thread-fetching hardware for managing instruction flow through the different stream processors, and a chunk of scratch memory that NVIDIA has labeled “L1 cache” in their diagrams.

This “L1 cache” is a “parallel data cache,” and their diagrams suggest that this cache is carved up into sixteen blocks, one for each stream processor.

(Note to the people who do these illustrations for NVIDIA: settle on one color for functional units, one for memory, one for groups of functional blocks, etc. Then, keep the same color in all the diagrams throughout the presentation. So all of the orange in the diagram above should actually be green, and all of the L1 should be orange instead of gray.)

Each of these 16 blocks acts as a giant, 4,000-entry virtual register file for a stream processor in the tile. Each SP can read from and write to this virtual register file as it executes a thread, with the result that the G80 can perform in one pass the kind of inter-element vector arithmetic operations that required multiple passes on previous GPUs.

The hardware also provides support for load/store access between main memory and this virtual register file, which is why it’s a bit like the Cell processor’s local store. Finally, this virtual register file is automatically backed by an L2 and, ultimately, by DRAM. Because the register file contents can be paged out to DRAM, this means that the CPU can access the results of a stream computation with a simple memory read.
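To keep the numbers above straight, here's a quick sanity check of the geometry as described; the figures come straight from this description, and the script is just arithmetic:

```python
# Back-of-the-envelope model of the G80 layout as described: 8 tiles, each
# with 2 groups of 8 scalar stream processors, and per-tile scratch memory
# carved into 16 blocks of ~4,000 virtual register entries each.

TILES = 8
GROUPS_PER_TILE = 2
SPS_PER_GROUP = 8
BLOCKS_PER_TILE = 16          # one scratch block per stream processor
ENTRIES_PER_BLOCK = 4000      # virtual register file entries per block

sps_per_tile = GROUPS_PER_TILE * SPS_PER_GROUP
total_sps = TILES * sps_per_tile
entries_per_tile = BLOCKS_PER_TILE * ENTRIES_PER_BLOCK

print(total_sps)          # 128 scalar stream processors in total
print(entries_per_tile)   # 64000 register entries of scratch per tile
```

So the chip as a whole fields 128 scalar stream processors, each with its own multi-thousand-entry slice of register space, all ultimately backed by the L2 and DRAM.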

Separating out the display hardware

It’s worth noting that the G80 chip, by itself, doesn’t have the hardware necessary to output an image to the display. Instead, it sends output to a separate chip on the graphics card, which has the hardware needed to drive a DVI or analog VGA display. In the case of the G80, this hardware was probably put on a separate chip because the G80 chip itself is already really huge. It could also be the case, though, that they eventually plan to gang together multiple G80 chips onto a single graphics card (after a die shrink to 65nm, probably) and have all of the chips feed data into the same display driver hardware.

This is the kind of thing that AMD/ATI will do when they eventually drop a GPU into a cHT socket. There will be a DVI display driver chip somewhere else on the motherboard, and the GPU will write to that so that its output can be displayed on-screen.


Take a look at the benchmarks in the “Further Reading” section below, and you’ll see that the 8800 has debuted today at the very top of the graphics market. AMD/ATI will be playing catch-up to this (and they will catch up), but this holiday season hardcore gamers will go to sleep on Christmas Eve hoping to wake up to an 8800 under the tree. Hopefully, Santa has $599 lying around for the top-end GTX model.

The 8800 not only offers a leap in graphics performance and image quality, but it also promises to give more indigestion to a certain dedicated physics processing unit company. Programmers will be able to write gameplay-affecting physics code in C, and have it run on the G80.

NVIDIA has more than just gamers in mind for the 8800, though. They’re targeting the same high-performance computing (HPC) markets as the other stream processing solutions from AMD/ATI and Peakstream. It’s worth noting that Intel isn’t resting on its laurels in this regard, either. I eagerly await news of what they’re up to in the area of graphics, CPU/GPU integration, and stream processing.

Further Reading:
Beyond3D: NVIDIA G80: Architecture and GPU Analysis
Anandtech: NVIDIA’s GeForce 8800 (G80): GPUs Re-architected for DirectX 10
Tom’s Hardware: GeForce 8800: Here Comes the DX10 Boom
AMD/ATI and NVIDIA tout new uses for the GPU
PeakStream unveils multicore and CPU/GPU programming solution


Not a drop to drink

Written by admin

Another week, another report on the state of the world we live in, and how we manage to use or misuse the resources around us. This time the subject is more fundamental to life than the loss of biodiversity from the oceans or loss of other natural habitats.

The UN Development Program is a multinational effort to improve living conditions around the globe, working with governments, populations, and the private sector to build "solutions to global and national development challenges." The UNDP has just released “Beyond scarcity: Power, poverty and the global water crisis." It is a topic that has been alluded to here on NI from time to time, and it's likely that you'll hear more about it in general as populations increase and access to clean fresh water becomes more problematic for too many of the world's people. As is often the case, it is the poorer members of society who end up suffering most from a lack of access to clean water.

If a family in the UK spent 3 percent of their family income on water, that would be considered a hardship; in countries like El Salvador or Nicaragua it is not uncommon for the poorest families to spend 10 percent of their income on water bills. A lack of effective infrastructure often means that water has to be purchased from vendors who charge 10 to 20 times more than the local utility would if the infrastructure existed. One significant problem is the amount that nations spend on water infrastructure—on average less than 0.5 percent of GDP—a figure usually dwarfed by military spending. Lessons from the developed world show how access to clean water can have huge impacts on public health—the greatest decline in mortality in US history followed the introduction of water filtration and effective sewerage.

The developing world is not just limited by access to clean drinking water, though. While the urban slums of Latin America, South Asia, and Africa are all subject to deficits in water and sanitation, rural communities suffer these same deficits, along with shortages of irrigation water for farming:

"The biggest challenge ahead is how to manage water resources faced with competition and climate change to meet rising food needs while protecting the access of poor and vulnerable people," said 2006 HDR lead author Kevin Watkins.

Smallholder farmers already make up the greatest proportion of the world's malnourished, and changing precipitation patterns due to climate change will only exacerbate the problem in coming decades. Sub-Saharan Africa is facing crop losses of up to 25 percent as weather patterns shift.

Water shortages also manifest themselves on the battlefield. During the past half-century, there have been almost 40 conflicts over access to shared water sources, the majority of them in the Middle East. Yet even here there is cause for hope. Since 1994, Jordan and Israel have been party to an accord that allows Jordan to store winter run-off in Israel while also giving Israel access to Jordanian wells for irrigation. Despite severe droughts in 1999, the agreement remains intact and in force. The region is still particularly at risk, however, and only Iran and Iraq are not subject to water shortages.

The report makes a number of suggestions for success in meeting development goals on water access:

1. Make water a human right—and mean it: "Everyone should have at least 20 litres of clean water per day and the poor should get it for free," says the Report: While a person in the UK or USA sends 50 litres down the drain each day by simply flushing their toilet, many poor people survive on less than five litres of contaminated water per day, according to HDR research.

2. Draw up national strategies for water and sanitation: Governments should aim to spend a minimum of one percent of GDP on water and sanitation, and enhance equity, the authors urge: Water and sanitation suffer from chronic under-funding. Public spending is typically less than 0.5 percent of GDP.

3. Increase international aid: The Report calls for an extra US$3.4 billion to $4 billion annually: Development assistance has fallen in real terms over the past decade, but to bring the MDG on water and sanitation into reach, aid flows will have to double, says the Report.


Jakob Nielsen says Apple study is bunk

Written by admin

Dr. Jakob Nielsen, a world-renowned expert on usability, has gotten fed up with the coverage of Apple's "study" claiming that larger monitors increase productivity (PDF). He doesn't have anything against large monitors per se, but he says that Apple's methodology for performing such a "study" is way out of whack with what is normally done during a usability study:

A prominent article about Apple's study reports, for example, that "cutting and pasting cells from Excel spreadsheets resulted in a 51.31% productivity gain — a task that took 20.7 seconds on the larger monitor versus 42.6 seconds on the smaller screen."


Apple's study focused at the wrong level of work. Pasting spreadsheet cells is not a user task, it's an operation at a low interaction level. More meaningful productivity has to be measured at a higher level, where users string together a sequence of operations to achieve their real-world goals.
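For what it's worth, the quoted percentage is easy to reproduce from the two timings; the small discrepancy with the article's "51.31%" presumably comes from rounding in the underlying data:

```python
# Checking the arithmetic behind the quoted productivity figure: going from
# 42.6 s on the small screen to 20.7 s on the large one is a ~51% reduction
# in task time.

small_screen = 42.6   # seconds for the task on the smaller monitor
large_screen = 20.7   # seconds on the larger monitor

time_saved = (small_screen - large_screen) / small_screen
print(round(100 * time_saved, 1))  # 51.4 (percent)
```

Which is, of course, exactly Nielsen's point: the number is real, but it measures a low-level operation rather than a task.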

Indeed, when I was in college and was tasked with performing a usability study on 3D software, we weren't allowed to tell our subjects to just do one task (move the ball from here to here on the screen) and see how well they performed at it. We told them to animate a bouncing ball using any and all tools available to them in the software, or any other software on the computer for that matter, and measured their ability to get the job done in a string of unspecified tasks.

The distinction between operations and tasks is important in application design because the goal is to optimize the user interface for task performance, rather than sub-optimize it for individual operations.

Nielsen also criticizes Apple for testing users using "rote, low-level operations that they'd trained on repeatedly until they got them exactly right," making them very skilled at the tasks they were being tested in instead of testing users based on realistic use. Extremely skilled performance is not something that happens a majority of the time for anyone using the computer, he says.

Skilled performance almost never happens on the Web, because users constantly encounter new pages; that is, they spend most of their time pondering options and trying to understand the content that's being presented.


Even in applications, skilled performance is rare because modern office workers typically move between many different tasks and screens.

So how would he have conducted the study?

-Involve a broad spectrum of representative users (not just experts).
-Have the users perform representative tasks (not just a few low-level operations).
-Don't tell users how to do the tasks; observe their real behavior.

Don't mess with Jakob Nielsen. It seems that the Apple study made a number of mistakes that even undergraduate students in usability classes (*ahem*) are taught to avoid when conducting usability studies. Did they do it all for marketing purposes, so that they could sell more gigantic 30-inch monitors? Probably. However, maybe if they had consulted a few experts and done it "right," the result might have been the same and they could still sell plenty of those gigantic 30-inchers.


Calling all Macs Running Windows: Do the OSX.MachArena!

Written by admin

I'm going to start this entry with a mea culpa. To be fair to Symantec, they didn't put out a press release about OSX.Macarena. They answered the questions of a media outlet which contacted them. That particular media outlet deserves the blame for trying to make a mountain out of a molehill—or a news story where there really was not enough information to do anything more than spread fear, uncertainty, and doubt. I am sure that responsible media outlets spreading FUD bugs the rest of you just as much as it bugs me. The underlying problem of whether the mac-o-sphere trusts anti-virus companies not to cry wolf is a whole 'nother issue, and one that will have to be dealt with at the critical point when a virus appears and is a real threat. That's a tough nut to crack.

Today, there is more information on OSX.Macarena, this time from Intego, which, by the way, did send out a press release for their November 6 Security Memo. It's interesting to note that this is not the "Macarena" virus, it's the "Mach Arena" virus. The risk remains very low, but there's more information about the source of the virus, its effects and transmission methods. A wise move on the part of Intego, if you ask me. Information smashes FUD flat.

The upshot is that OSX.MachArena infects only mach-o binaries. In my limited experience, mach-o binaries are generally Terminal-related—command line programs, or dynamically linked libraries of code (my experience with mach-o is very limited, so if someone else can explain better what is and is not vulnerable, please feel free to comment). Only mach-o binaries in the same folder as an infected executable can be infected, which severely limits possible outbreaks. PPC and Universal applications are immune.
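For the curious, identifying a mach-o file comes down to checking its first four bytes. Here's a hedged little sketch in Python; it is a file-type check only, not a virus detector, and note that the 0xCAFEBABE "fat" magic is also shared by Java class files, so real tools look deeper into the header:

```python
# Classify a file header by its magic number. 0xFEEDFACE marks a 32-bit
# mach-o binary (0xCEFAEDFE is the byte-swapped form), while 0xCAFEBABE
# marks a "fat"/Universal binary containing multiple architectures.

import struct

MACHO_MAGICS = {0xFEEDFACE, 0xCEFAEDFE}   # thin 32-bit mach-o
FAT_MAGIC = 0xCAFEBABE                    # Universal (fat) binary

def classify(header: bytes) -> str:
    if len(header) < 4:
        return "too short"
    (magic,) = struct.unpack(">I", header[:4])
    if magic in MACHO_MAGICS:
        return "mach-o"
    if magic == FAT_MAGIC:
        return "universal"
    return "other"

print(classify(struct.pack(">I", 0xFEEDFACE)))  # mach-o
print(classify(b"MZ\x90\x00"))                  # other (a Windows PE stub)
```

This also shows why Universal applications sit outside the virus's reach: their container format simply isn't the thin mach-o layout the infector targets.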

The virus spreads from a Mac's Windows installation. Users of both Boot Camp and Parallels Desktop can be infected. We don't have to warn y'all that if you run Windows on your Mac, you open yourself up to a host of Windows viruses and security problems, do we? We didn't think so.

Will makers of Windows antivirus software (some of which is free) protect against possible infection from the Windows side of the problem? There is a distinct possibility that this could be seen as a "Mac-only" problem and thus not worth their time. And will Apple and the various virtualization providers create technical barriers to virus transmission from Windows installations to their Mac hosts? This is a big open window in a Mac's armor. It might be nice to see some bars and security glass.


Shocker of the year: TIME’s Gadget of the Year list includes Apple, twice

Written by admin

Ahh, TIME Magazine, you Apple whores aficionados, you. With TIME constantly giving Apple their Gadget of the Week awards, it's probably no surprise that Apple products have made TIME's Gadget of the Year list for 2006—not once, but twice.

The two products that made the Gadget of the Year list are the coveted MacBook Pro:

This 'iMac on wheels' has built-in iSight camera and remote for Front Row media manager. It features illuminated keyboard and brighter screen plus magnetic breakaway power cord. With the Intel Core Duo it's easy to watch the highest-definition QuickTime movie trailers.

…and the Nike+iPod Sport Kit:

This tiny runner's aid measures your pace from inside a specially designed shoe. Data streams to your iPod Nano which keeps time and calculates caloric burn. Select a distance or time for your workout (plus music) and a voice prompts you at intervals and when you're done you dock and upload the data and track your performance. Set goals for yourself or challenge others. Battery lasts about 1,000 hours — that's an hour a day for nearly three years.

Other gadgets on the list include the Nintendo DS Lite, a Garmin StreetPilot, and the Palm Treo 700w (among others).

Once you've gone through the whole slideshow, you have the opportunity to vote on what you think should be the official Gadget of the Year. So far, the MacBook Pro is gunning it out against the DS Lite for the #1 spot (the DS Lite currently leads the MacBook Pro by 4 percentage points), while all of the others are getting paltry results. Not to sway the vote or anything, but I voted for the Nike+iPod Sport Kit. 😉


FTC looks at privacy in the next “Tech-ade”

Written by admin on 08/06/2019

The Federal Trade Commission has just wrapped up an important conference that was organized to give the agency a better sense of the challenges that technology will pose to consumers in the next 10 years. Unfortunately titled “Protecting Consumers in the Next Tech-ade,” the conference brought together industry leaders and consumer advocates to talk about advertising, RFID, social networking, and user-generated content. Packing a crowd of people into an auditorium to talk about social networking isn’t everyone’s idea of a good time, so the proceedings were livened up with video clips. The conference concluded with one about a “cyber patriot,” a reenactor who “uses cell phones and laptops to convene and reenact the French and Indian War,” according to the official blog of the event. Good times.

The “cyber patriot” aside, the conference focused extensively on consumer privacy and the ways that it can be eroded by new technologies. Marcia Hofmann of the Electronic Frontier Foundation pointed out that “there are few market incentives to protect consumer privacy,” but the advertisers who spoke did not (surprise) see it that way. Most of them believed that there were market incentives in place to pay attention to privacy concerns; one solid consumer backlash over a privacy scandal could be enough to damage a brand.

Joshua Smith, from Intel Research Seattle, showed a video clip demonstrating how RFID can help the elderly and the children who need to keep an eye on them. He demonstrated an RFID reader built into a bracelet that can read tags found in household objects such as toothbrushes. If you need to know how many times Grandpa opened his pill bottle, brushed his teeth, or moved from room to room, just check the RFID log. Such technologies have privacy implications, of course, but it’s not clear that consumers are always aware of them.
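The "just check the RFID log" idea amounts to a simple tally over tag reads. A minimal sketch of that tally follows; the log format, timestamps, and tag names here are all hypothetical, not from Intel's demo:

```python
from collections import Counter
from datetime import datetime

# Hypothetical log of (timestamp, tag_id) reads from a wrist-worn RFID reader.
reads = [
    ("2006-11-08 08:01", "pill_bottle"),
    ("2006-11-08 08:02", "toothbrush"),
    ("2006-11-08 12:30", "pill_bottle"),
    ("2006-11-08 19:45", "pill_bottle"),
]

def daily_counts(reads):
    """Tally how often each tagged object was handled on each day."""
    counts = Counter()
    for stamp, tag in reads:
        day = datetime.strptime(stamp, "%Y-%m-%d %H:%M").date()
        counts[(day, tag)] += 1
    return counts

counts = daily_counts(reads)  # three pill-bottle reads logged on Nov. 8
```

The privacy question, of course, is who else gets to run this tally over Grandpa's day.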

In closing remarks to the conference, Lydia Parnes, the director of the Bureau of Consumer Protection, said that many consumers are overwhelmed by new technologies that they do not fully understand. While allowing users to opt out of certain types of information-gathering practices is worthwhile, Parnes’ comments suggest that many users remain unaware of what information is collected about them and how it can be used. Giving people choice without context will not automatically provide consumer protection. (This was also the claim made in the recent FTC complaint filed against Microsoft and others in the advertising business.)

The FTC is trying to get a handle on the sorts of issues likely to become a problem in the next decade. At the moment, their Internet enforcement is usually directed at spammers and adware vendors, and it’s not clear whether they’ll have the resources to add new areas of enforcement. The agency has just announced a settlement with Zango, for instance, for $3 million over that company’s adware programs, and has secured another $50,000 from a spammer. Ten years from now, we suspect they’ll still be fighting this battle.


IEEE working on new laptop battery standards

Written by admin

The IEEE announced today that it will begin revamping its laptop battery standard, IEEE 1625, in order to meet demands to make laptop batteries more reliable and robust in light of recent safety concerns. The IEEE Standards Association (IEEE-SA) working group is also considering making additions to the IEEE 1625 standard that would help to ensure better compliance.

The current IEEE 1625 standard was approved in 2004 and already defines specific approaches to evaluating and verifying the quality of laptop batteries, as well as "battery pack electrical and mechanical construction, cell chemistries, packaging, pack and cell controls, and overall system considerations." The revision is meant "to further safeguard the reliability of these batteries," according to Edward Rashba, manager of new technical programs at the IEEE-SA, who added that the group plans to incorporate lessons learned while developing the IEEE 1725 standard for cell phone batteries.

Rashba says that companies such as Apple, Dell, Gateway, Hewlett-Packard, IBM, Intel, Lenovo, Panasonic, Sanyo, and Sony have all indicated a "strong interest" in participating in the development of the revised standard, and that the group ambitiously plans to meet bi-monthly in both the US and Asia, starting on November 15, until the standard is completed.

The IEEE-SA task force plans to work on revisions to the current standard over the next 18 months. This means that the new IEEE 1625 standard for laptop batteries will likely not come out of the IEEE until sometime in mid-2008, which seems like an awfully long time from now. If that is truly the case, the Association Connecting Electronics Industries (IPC)—whose battery standards are expected out by the second quarter of 2007—will likely beat the IEEE to new laptop battery standards. The question is: if two separate standards emerge, which will the industry choose?


US is a broadband laggard, according to FCC commissioner

Written by admin

One of the commissioners on the board of the Federal Communications Commission is taking shots at the state of broadband in the US. In an op-ed piece in yesterday’s Washington Post, Democratic commissioner Michael J. Copps criticized the state of American broadband and called for changes in how the FCC handles it.

The problem

Part of the problem comes from how the FCC defines broadband penetration. Under the Commission’s guidelines, a zip code is said to be served by broadband if there is a single person within its boundaries with a 200kbps connection. Not only is that a metric that sets the broadband bar incredibly low—less than four times faster than dial-up—but it does nothing to assess how deeply broadband service has penetrated into a given area.
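The gap between the FCC's anyone-in-the-zip-code test and an actual penetration measure is easy to see in a toy calculation. The zip codes and line speeds below are invented for illustration:

```python
# Contrast the FCC's "served" test with a share-of-households measure.
BROADBAND_KBPS = 200  # the FCC's (very low) broadband bar

# Sampled household speeds (kbps) in two hypothetical zip codes.
zips = {
    "12345": [56, 56, 56, 768],      # one DSL line among dial-up users
    "67890": [3000, 1500, 768, 56],  # mostly served
}

def fcc_served(speeds):
    """FCC rule: the zip counts as served if any one line clears the bar."""
    return any(s >= BROADBAND_KBPS for s in speeds)

def penetration(speeds):
    """Fraction of sampled households actually at broadband speed."""
    return sum(s >= BROADBAND_KBPS for s in speeds) / len(speeds)

for z, speeds in zips.items():
    print(z, fcc_served(speeds), penetration(speeds))
# Both zips count as "served," though penetration is 25% in one
# and 75% in the other.
```

Under the current metric, the lone 768kbps subscriber in "12345" makes the whole zip code look connected.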

Copps also cites a sad litany of metrics. According to the International Telecommunication Union (ITU), the US sits in 15th place worldwide when it comes to broadband penetration. The US does even worse with another ITU metric, the Digital Opportunity Index (DOI). The DOI measures 11 different variables, including Internet access price, proportion of users online, and proportion of homes with Internet access, and it slots the US in at the number 21 position, right between Estonia and Slovenia (South Korea, Japan, and Denmark top the DOI).

Most US residents stuck with a subpar broadband connection will agree that the biggest problem is a lack of competition. We have talked about the cable/DSL duopoly before on Ars, and as Copps points out, a duopoly is the best-case scenario at the moment. Many broadband users have only one option.

The solution

According to Copps, "the FCC needs to start working to lower prices and introduce competition." Unfortunately, Copps’ vision of competition doesn’t appear to address what is arguably the biggest problem of all: the FCC’s policy of classifying broadband as an information service, deregulating it, and counting upon competition between modes of delivery (e.g., cable vs. DSL) to save the day. Copps ignores that issue altogether, calling on the FCC to speed up the process of making unlicensed spectrum available and to encourage "third pipe" broadband technologies such as BPL (broadband over power lines) and wireless.

Those proposals are a start, but by themselves are not enough to effect significant change in the near term. BPL, despite the periodic attention it gets, has less than 6,000 customers in the US. By and large, utilities have not shown themselves to be interested in getting into the ISP business. Opening up spectrum is no panacea, either. WiMAX deployments are just beginning and will not come cheaply.

Here are a couple of other suggestions. First, cities and towns should be allowed—if not encouraged—to build and deploy broadband networks. This goes beyond rolling out city-wide WiFi networks and into the area of fiber optics. Currently, a handful of states bar municipal governments from rolling their own fiber optic networks. The massive rewrite of telecom legislation that emerged from the Senate Commerce Committee last summer would override such state-level restrictions, allowing any city to get into the game. That bill has been stalled, however, and is unlikely to see the light of day in light of the election results this week.

Another option the FCC should seriously consider is turning back the clock on deregulation. Up until the FCC classified DSL as an information service, ILECs (incumbent local exchange carriers—phone companies) were required to lease their lines to DSL providers at competitive rates. That left consumers with a number of choices for DSL: your local phone company, EarthLink, and Speakeasy to name three. In the wake of deregulation, that is no longer the case. Sure, the ILEC will still lease its lines to other companies, but it will cost them. Cole Reinward, EarthLink’s VP for municipal product strategy and marketing, told Ars Technica earlier this week that the DSL wholesale rates offered to EarthLink by ILECs "weren’t particularly attractive and close to what they are offering retail to subscribers, making it difficult to compete on price."

Copps calls for the kind of "public-private initiatives like those that built the railroad, highway, and telephone systems." That’s an excellent start—just look at the increasing number of municipal WiFi networks being developed via city-company partnerships. But when the private sector drops the ball, the public sector needs to have free rein to step in. Witness Qwest in Seattle: the telecom has refused to deploy fiber in Seattle while at the same time lobbying against the city’s developing its own network.

Let’s also get our metrics straight. Although 200kbps might look good to someone whose only alternative is dial-up, it’s a joke when it comes to broadband. Revise it to something realistic, like 768kbps, which has become the lowest-tiered DSL offering available in most geographies and arguably marks the lower limit of "true" broadband. And make sure we rely on a representative sampling of an area to determine broadband penetration instead of the current single-person-in-a-zip-code model.

It seems like everybody agrees on an essential point: access to "quality," reasonably priced broadband is crucial in this day and age. Unfortunately, we’re not even close in the US. Yes, the nation’s two largest telecoms are at this moment rolling out new fiber optic networks. Better yet, consumers in areas served by Verizon’s new FiOS network are seeing the benefits of increased competition: some cable providers in those areas are bumping speeds up to 15Mbps/1.5Mbps. However, fiber deployments are slow and selective, leaving most Americans out in the cold.

We may be looking at a radically different landscape in five years, with WiMAX, BPL, cable, DSL, and municipal WiFi networks offering consumers a host of equally good choices. That rosy outcome is by no means guaranteed—much has to be done in the interim to make it a reality.

Further reading:
Washington Post: America’s Internet Disconnect
ITU: Digital Opportunity Index
FCC rules on fiber, power-line broadband regulation
FCC rules on cable lines


Macworld shows OS X 10.4.8 improves Rosetta performance

Written by admin

While running Mac Office under Rosetta makes launches a little slower, the individual Office applications are nimble enough on a Core Duo or Core 2 Duo Mac, assuming one has plenty of memory. Unfortunately, the same cannot be said for many applications from Adobe. Whether it's Creative Suite or even the consumer-level Photoshop Elements, running under Rosetta is like using a late-model G4 Mac, or even worse, a G3—shudder. It could be worse than that, though: performance has in fact gotten better since 10.4.8 was released, at least according to James Galbraith at the Macworld Lab.

Intel-based systems saw dramatic gains in Photoshop and Office tests, enough to bump up Speedmark scores by a couple of points. More in-depth testing found speed improvements in both Word and Photoshop ranging from 3 percent to 36 percent. Photoshop numbers saw the biggest jumps, especially on the Mac Pro/Quad 2.66GHz Xeon.
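For those curious how a "3 percent to 36 percent" figure is derived, a speed improvement is usually computed from the drop in task completion time. The benchmark times below are made up for illustration, not Macworld's actual numbers:

```python
# Percent speed improvement from before/after benchmark times.
def percent_improvement(before_s, after_s):
    """Speed gain implied by a drop in task completion time."""
    return (before_s - after_s) / before_s * 100

# Hypothetical Photoshop task, in seconds, before and after the
# 10.4.8 update: a drop from 125s to 80s is a 36 percent improvement.
print(round(percent_improvement(125.0, 80.0), 1))  # 36.0
print(round(percent_improvement(100.0, 97.0), 1))  # 3.0
```

Note this measures the reduction in time; the equivalent throughput gain (tasks per hour) would be larger.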

Interestingly, the latest update to OS X offered no improvement (indeed, a slight decrease) in performance for G5 Macs using Creative Suite. Nonetheless, the Quad G5 still easily beats a quad Mac Pro, though not by as much as one might think, and the new iMac actually ties the G5 iMac in Creative Suite testing.

To what can one attribute this performance boost for Rosetta? According to the release notes, 10.4.8 "improves the accuracy of Rosetta numerics and addresses Altivec translation issues on Intel-based Macs," whatever that means. While native versions of Mac Office and Creative Suite are ultimately the solution to the problem of performance, it's nice to see Apple not simply waiting for others to solve the problem of emulation for them. Let's hope 10.4.9 offers even better performance.


Defense Department funds massive speech recognition and translation program

Written by admin

The Defense Advanced Research Projects Agency (DARPA) isn’t known for thinking small, and it has now turned its attention (and budget) to a massive task: developing a set of software engines that can transcribe, translate, and summarize both text and speech without training or human intervention. The program, called Global Autonomous Language Exploitation (GALE), attempts to address the shortage of qualified linguists and analysts who know important languages like Mandarin and Arabic.

When bid solicitations went out last year, they told interested parties that DARPA wanted three separate modules built. The first handles the transcription of spoken languages into text. The second is a translation module that can convert foreign text into English, and the third is a “distillation” engine that can answer questions and summarize information provided by the other two modules. While this technology would certainly be put to use by military personnel in the field, it is really designed for deployment in the US, where analysts are easily overwhelmed by the electronic information gathered by the intelligence community.

Most of this information simply goes untranslated, but if GALE is a success, the US government would have access to transcriptions of foreign broadcast news, talk shows, newspaper articles, blogs, e-mails, and telephone conversations. Even with the translation work done, though, this information would be overwhelming, which is why the distillation engine is such an important component of the product.

The project, now more than one year old, has several teams of contractors competing with one another to develop the best software. Those companies are IBM, SRI International, and BBN Technologies, and they are supported by the Linguistic Data Consortium at the University of Pennsylvania. To remain in the program and continue to receive funding, each group must hit performance milestones; DARPA says that the transcription engine must be at least 65 percent accurate and the translation engine must be 75 percent accurate at the first milestone. The final milestone in the program is 95 percent accuracy in both modules.
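Accuracy figures like those milestones are commonly computed from word error rate, i.e. the word-level edit distance between the system's output and a reference transcript. The sketch below shows that standard technique; it is an assumption about how such scoring works in general, not DARPA's published evaluation method:

```python
# Transcription accuracy as 1 - (word-level edit distance / reference words).
def word_errors(ref, hyp):
    """Levenshtein distance over words (classic dynamic programming)."""
    r, h = ref.split(), hyp.split()
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i  # deleting all reference words
    for j in range(len(h) + 1):
        d[0][j] = j  # inserting all hypothesis words
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1  # substitution cost
            d[i][j] = min(d[i - 1][j] + 1,       # deletion
                          d[i][j - 1] + 1,       # insertion
                          d[i - 1][j - 1] + cost)
    return d[len(r)][len(h)]

def accuracy(ref, hyp):
    return 1 - word_errors(ref, hyp) / len(ref.split())

ref = "the translation engine must be seventy five percent accurate"
hyp = "the translation engine must be seventy percent accurate"
print(round(accuracy(ref, hyp), 2))  # one dropped word out of nine: 0.89
```

At this granularity, even one garbled word per sentence keeps a system well short of the 95 percent final milestone.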

The Associated Press recently took a look at the BBN team, which has 24 people working on the project. DARPA has already made clear that they will cut any team not meeting performance targets, which would automatically translate into job cuts at a company like BBN. “I cannot entertain that idea right now,” team leader John Makhoul told the AP. “It’s just so drastic that we just don’t think about it.”

As it happened, though, no one was cut after the first performance trials of the system, so research into GALE continues for all three firms. Though all were able to approach or exceed the initial targets, approaching 95 percent accuracy—even for casual discussion in noisy environments—remains a huge challenge. DARPA has a stated goal of “eliminating the need for linguists and analysts,” but that day may still be years away.
