FTC looks at privacy in the next “Tech-ade”

Written by admin on 08/06/2019 Categories: 杭州夜生活

The Federal Trade Commission has just wrapped up an important conference that was organized to give the agency a better sense of the challenges that technology will pose to consumers in the next 10 years. Unfortunately titled “Protecting Consumers in the Next Tech-ade,” the conference brought together industry leaders and consumer advocates to talk about advertising, RFID, social networking, and user-generated content. Packing a crowd of people into an auditorium to talk about social networking isn’t everyone’s idea of a good time, so the proceedings were livened up with video clips. The conference concluded with one about a “cyber patriot,” a reenactor who “uses cell phones and laptops to convene and reenact the French and Indian War,” according to the official blog of the event. Good times.

The “cyber patriot” aside, the conference focused extensively on consumer privacy and the ways that it can be eroded by new technologies. Marcia Hofmann of the Electronic Frontier Foundation pointed out that “there are few market incentives to protect consumer privacy,” but the advertisers who spoke did not (surprise) see it that way. Most of them believed that there were market incentives in place to pay attention to privacy concerns; one solid consumer backlash over a privacy scandal could be enough to damage a brand.

Joshua Smith, from Intel Research Seattle, showed a video clip demonstrating how RFID can help the elderly and the children who need to keep an eye on them. He showed an RFID reader built into a bracelet that can read tags embedded in household objects such as toothbrushes. If you need to know how many times Grandpa opened his pill bottle, brushed his teeth, or moved from room to room, just check the RFID log. Such technologies have privacy implications, of course, but it’s not clear that consumers are always aware of them.

In closing remarks to the conference, Lydia Parnes, the Director of the Bureau of Consumer Protection, said that many consumers are overwhelmed by new technologies that they do not fully understand. While allowing users to opt out of certain types of information-gathering practices is worthwhile, Parnes’ comments suggest that many users remain unaware of what information is collected about them and how it can be used. Giving people choice without context will not automatically provide consumer protection. (This was also the claim made in the recent complaint filed with the FTC against Microsoft and others in the advertising business.)

The FTC is trying to get a handle on the sorts of issues likely to become a problem in the next decade. At the moment, their Internet enforcement is usually directed at spammers and adware vendors, and it’s not clear whether they’ll have the resources to add new areas of enforcement. The agency has just announced a settlement with Zango, for instance, for $3 million over that company’s adware programs, and has secured another $50,000 from a spammer. Ten years from now, we suspect they’ll still be fighting this battle.


IEEE working on new laptop battery standards


The IEEE announced today that it will begin revamping its laptop battery standard, IEEE 1625, in order to meet demands to make laptop batteries more reliable and robust in light of recent safety concerns. The IEEE Standards Association (IEEE-SA) working group is also considering making additions to the IEEE 1625 standard that would help to ensure better compliance.

The current IEEE 1625 standard was approved in 2004 and already defines specific approaches to evaluating and verifying the quality of laptop batteries, as well as "battery pack electrical and mechanical construction, cell chemistries, packaging, pack and cell controls, and overall system considerations." The revision is meant "to further safeguard the reliability of these batteries," according to Edward Rashba, Manager of New Technical Programs at the IEEE-SA, who added that the group plans to incorporate lessons learned while developing the IEEE 1725 standard for cell phone batteries.

Rashba says that companies such as Apple, Dell, Gateway, Hewlett-Packard, IBM, Intel, Lenovo, Panasonic, Sanyo, and Sony have all indicated a "strong interest" in participating in the development of the revised standard, and that the group ambitiously plans to meet bi-monthly, starting November 15, in both the US and Asia until the standard is completed.

The IEEE-SA task force plans to work on revisions to the current standard over the next 18 months. This would mean that the new IEEE 1625 standard for laptop batteries will likely not come out of the IEEE until sometime in mid-2008, which seems like an awfully long time from now. If this is truly the case, the Association Connecting Electronics Industries (IPC)—whose battery standards are expected to be out by the second quarter of 2007—will likely beat the IEEE in developing new standards for laptop batteries. The question is: If there are two separate standards developed, which will the industry choose?


US is a broadband laggard, according to FCC commissioner


One of the commissioners on the board of the Federal Communications Commission is taking shots at the state of broadband in the US. In an op-ed piece in yesterday’s Washington Post, Democratic commissioner Michael J. Copps criticized the country’s broadband situation and called for changes in how the FCC handles broadband.

The problem

Part of the problem comes from how the FCC defines broadband penetration. Under the Commission’s guidelines, a zip code is said to be served by broadband if there is a single person within its boundaries with a 200kbps connection. Not only is that a metric that sets the broadband bar incredibly low—less than four times faster than dial-up—but it does nothing to assess how deeply broadband service has penetrated into a given area.

Copps also cites a sad litany of metrics. According to the International Telecommunication Union (ITU), the US sits in 15th place worldwide when it comes to broadband penetration. The US does even worse with another ITU metric, the Digital Opportunity Index (DOI). The DOI measures 11 different variables, including Internet access price, proportion of users online, and proportion of homes with Internet access, and it slots the US in at the number 21 position, right between Estonia and Slovenia (South Korea, Japan, and Denmark top the DOI).

Most US residents stuck with subpar broadband connections will agree that the biggest problem is a lack of competition. We have talked about the cable/DSL duopoly before on Ars, and as Copps points out, duopolies are best-case scenarios at the moment. Many broadband users are stuck with only one option when it comes to broadband.

The solution

According to Copps, "the FCC needs to start working to lower prices and introduce competition." Unfortunately, Copps’ vision of competition doesn’t appear to address what is arguably the biggest problem of all: the FCC’s policy of classifying broadband as an information service, deregulating it, and counting upon competition between modes of delivery (e.g., cable vs. DSL) to save the day. Copps ignores that issue altogether, calling on the FCC to speed up the process of making unlicensed spectrum available and to encourage "third pipe" broadband technologies such as BPL (broadband over power lines) and wireless.

Those proposals are a start, but by themselves they are not enough to effect significant change in the near term. BPL, despite the periodic attention it gets, has fewer than 6,000 customers in the US. By and large, utilities have not shown themselves to be interested in getting into the ISP business. Opening up spectrum is no panacea, either. WiMAX deployments are just beginning and will not come cheaply.

Here are a couple of other suggestions. First, cities and towns should be allowed—if not encouraged—to build and deploy broadband networks. This goes beyond rolling out city-wide WiFi networks and into the area of fiber optics. Currently, a handful of states bar municipal governments from rolling their own fiber optic networks. The massive rewrite of telecom legislation that emerged from the Senate Commerce Committee last summer would override such state-level restrictions, allowing any city to get into the game. That bill has been stalled, however, and is unlikely to see the light of day in light of the election results this week.

Another option the FCC should seriously consider is turning back the clock on deregulation. Up until the FCC classified DSL as an information service, ILECs (incumbent local exchange carriers—phone companies) were required to lease their lines to DSL providers at competitive rates. That left consumers with a number of choices for DSL: your local phone company, EarthLink, and Speakeasy to name three. In the wake of deregulation, that is no longer the case. Sure, the ILEC will still lease its lines to other companies, but it will cost them. Cole Reinward, EarthLink’s VP for municipal product strategy and marketing, told Ars Technica earlier this week that the DSL wholesale rates offered to EarthLink by ILECs "weren’t particularly attractive and close to what they are offering retail to subscribers, making it difficult to compete on price."

Copps calls for the kind of "public-private initiatives like those that built the railroad, highway, and telephone systems." That’s an excellent start—just look at the increasing number of municipal WiFi networks being developed via city-company partnerships. But when the private sector drops the ball, the public sector needs to have free rein to step in. Witness Qwest in Seattle: the telecom has refused to deploy fiber in Seattle while at the same time lobbying against the city’s developing its own network.

Let’s also get our metrics straight. Although 200kbps might look good to someone whose only alternative is dial-up, it’s a joke when it comes to broadband. Revise it to something realistic, like 768kbps, which has become the lowest-tiered DSL offering available in most geographies and arguably marks the lower limit of "true" broadband. And make sure we rely on a representative sampling of an area to determine broadband penetration instead of the current single-person-in-a-zip-code model.
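
To make the contrast concrete, here is a small hypothetical sketch (the function names, thresholds, and sample numbers are illustrative, not the FCC's actual methodology) comparing the current single-subscriber rule with a sampling-based penetration measure:

```python
def served_fcc(zip_speeds_kbps):
    """Current FCC rule as described above: a zip code counts as
    'served by broadband' if even one connection reaches 200 kbps."""
    return any(speed >= 200 for speed in zip_speeds_kbps)

def penetration(zip_speeds_kbps, floor_kbps=768):
    """Alternative: the fraction of sampled households at or above a
    more realistic broadband floor (768 kbps, per the article)."""
    return sum(speed >= floor_kbps for speed in zip_speeds_kbps) / len(zip_speeds_kbps)

# One household with fast cable makes the whole zip 'served' today...
sample = [56, 56, 56, 3000]   # three dial-up homes, one cable home
print(served_fcc(sample))     # True
print(penetration(sample))    # 0.25
```

The toy sample shows exactly the distortion Copps complains about: a zip code where three of four homes are on dial-up still counts as fully "served" under the current rule.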

It seems like everybody agrees on an essential point: access to "quality," reasonably priced broadband is crucial in this day and age. Unfortunately, we’re not even close in the US. Yes, the nation’s two largest telecoms are at this moment rolling out new fiber optic networks. Better yet, consumers in areas served by Verizon’s new FiOS network are seeing the benefits of increased competition: some cable providers in those areas are bumping speeds up to 15Mbps/1.5Mbps. However, fiber deployments are slow and selective, leaving most Americans out in the cold.

We may be looking at a radically different landscape in five years, with WiMAX, BPL, cable, DSL, and municipal WiFi networks offering consumers a host of equally good choices. That rosy outcome is by no means guaranteed—there’s much that has to be done in the interim to make it a reality.

Further reading:
Washington Post: America’s Internet Disconnect
ITU: Digital Opportunity Index
FCC rules on fiber, power-line broadband regulation
FCC rules on cable lines


Macworld shows OS X 10.4.8 improves Rosetta performance


While running Mac Office under Rosetta may be a little slower when launching, the individual office applications are nimble enough on a Core Duo or Core 2 Duo Mac, assuming one has plenty of memory. Unfortunately, the same cannot be said for many applications from Adobe. Whether it's Creative Suite or even the consumer software Photoshop Elements, running under Rosetta is like, well, using a late-model G4 Mac, or even worse, a G3—shudder. Amazingly, it used to be worse still: performance has in fact improved since 10.4.8 was released, at least according to James Galbraith at the Macworld Lab.

Intel-based systems saw dramatic gains in Photoshop and Office tests, enough to bump up Speedmark scores by a couple of points. More in-depth testing found speed improvements in both Word and Photoshop ranging from 3 percent to 36 percent. Photoshop numbers saw the biggest jumps, especially on the Mac Pro/Quad 2.66GHz Xeon.

Interestingly, the latest update to OS X offered no improvement—actually a slight decrease—in Creative Suite performance for G5 Macs. Nonetheless, the Quad G5 still beats a quad Mac Pro, though not by as much as one might think, and the new iMac actually ties the G5 iMac in testing under Creative Suite.

To what can one attribute this performance boost for Rosetta? According to the release notes, 10.4.8 "improves the accuracy of Rosetta numerics and addresses Altivec translation issues on Intel-based Macs," whatever that means. While native versions of Mac Office and Creative Suite are ultimately the solution to the problem of performance, it's nice to see Apple not simply waiting for others to solve the problem of emulation for them. Let's hope 10.4.9 offers even better performance.


Defense Department funds massive speech recognition and translation program


The Defense Advanced Research Projects Agency (DARPA) isn’t known for thinking small, and it has now turned its attention (and budget) to a massive task: developing a set of software engines that can transcribe, translate, and summarize both text and speech without training or human intervention. The program, called Global Autonomous Language Exploitation (GALE), attempts to address the lack of qualified linguists and analysts who know important languages like Mandarin and Arabic.

When bid solicitations went out last year, they told interested parties that DARPA wanted three separate modules built. The first handles the transcription of spoken languages into text. The second is a translation module that can convert foreign text into English, and the third is a “distillation” engine that can answer questions and summarize information provided by the other two modules. While this technology would certainly be put to use by military personnel in the field, it is really designed for deployment in the US, where analysts are easily overwhelmed by the electronic information gathered by the intelligence community.

Most of this information simply goes untranslated, but if GALE is a success, the US government would have access to transcriptions of foreign broadcast news, talk shows, newspaper articles, blogs, e-mails, and telephone conversations. Even with the translation work done, though, this information would be overwhelming, which is why the distillation engine is such an important component of the product.

The project, now more than one year old, has several teams of contractors competing with one another to develop the best software. Those companies are IBM, SRI International, and BBN Technologies, and they are supported by the Linguistic Data Consortium at the University of Pennsylvania. To remain in the program and continue to receive funding, each group must hit performance milestones; DARPA says that the transcription engine must be at least 65 percent accurate and the translation engine must be 75 percent accurate at the first milestone. The final milestone in the program is 95 percent accuracy in both modules.
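
The article doesn't spell out how DARPA scores "accuracy," but transcription and translation systems are conventionally scored by edit distance between the system output and a human reference. A minimal sketch of that standard metric (the function name and example sentences are illustrative) looks like:

```python
def word_accuracy(reference, hypothesis):
    """Word-level accuracy via edit (Levenshtein) distance between a
    reference transcript and a system hypothesis: 1.0 minus the word
    error rate. This is the usual basis for milestones like '65 percent
    accurate,' though DARPA's exact formula isn't given in the article."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return 1.0 - dp[len(ref)][len(hyp)] / len(ref)

# One substitution out of six words -> roughly 0.83 accuracy
print(word_accuracy("the cat sat on the mat", "the cat sat on that mat"))
```

Under a metric like this, "95 percent accuracy" means fewer than one word in twenty inserted, dropped, or substituted, which is why casual speech in noisy environments remains such a high bar.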

The Associated Press recently took a look at the BBN team, which has 24 people working on the project. DARPA has already made clear that they will cut any team not meeting performance targets, which would automatically translate into job cuts at a company like BBN. “I cannot entertain that idea right now,” team leader John Makhoul told the AP. “It’s just so drastic that we just don’t think about it.”

As it happened, though, no one was cut after the first performance trials of the system, so research into GALE continues for all three firms. Though all were able to approach or exceed the initial targets, approaching 95 percent accuracy—even for casual discussion in noisy environments—remains a huge challenge. DARPA has a stated goal of “eliminating the need for linguists and analysts,” but that day may still be years away.


$1 of Zune price goes straight to Universal Music, won’t pass “go”


Microsoft’s Zune media player, due for retail release next week, will have the support of Vivendi Universal under an unusual contract. For every $249 Zune player sold, Universal will get $1 (subscription required) to make up for the “unauthorized content” the company expects will make its way onto the device.

Universal says that half of the fees collected will be passed on to its stable of recording artists, including U2, Jay-Z, Linkin Park, Luciano Pavarotti, and Bon Jovi. The rest will presumably pad Vivendi’s income statement a bit and make up for some of the lost CD sales revenue the industry bemoans at every opportunity.

Microsoft says it is discussing similar deals with other studios. The Zune has a wireless song-sharing feature that could raise the hackles of music industry executives, and at less than 0.5 percent of the total sale price, Universal’s cut appears rather reasonable. The motivation for it, however, is open for discussion.

“The only factor was that we feel that there’s a great deal of music that’s (stored) on these devices that was never legitimately obtained, and we wanted to get some sort of compensation for what we thought we’re losing,” said Universal Music Group CEO Doug Morris, a previous acquaintance here at Ars. “I want our artists to be paid for the music that makes these devices popular.” He then goes on to lament the fact that end users tend to rip their own CDs onto iPods and other music players.

I’m sorry, but Universal already got paid for that content when the CD was sold, and ripping the songs for use on newfangled digital music players falls squarely under “fair use.” Why should I have to pay Apple for a digital copy of Livin’ on a Prayer when I already own Slippery When Wet? Yet that’s exactly what Morris wants to see. The studios have the right to refuse access to their catalogs for any particular digital music service, and they use that trump card to squeeze every penny they can out of the common consumer.


Head of ESA: Don’t call us “video gamers”


Doug Lowenstein, the head of the Entertainment Software Association, is unhappy with the popular term “video game” being used to describe, well, video games. He believes that using the term “game” belittles the industry, preventing it from being taken seriously.

His suggested replacement terms are “interactive entertainment,” which evokes memories of 1990s CD-based shovelware and the horrid Philips CD-i, or “entertainment software,” which his organization has already been using for some time with little effect.

Lowenstein’s suggestions are unlikely to have much impact in the popular vernacular. After all, we don’t go around calling movies “celluloid entertainment” or TV shows “entertainment broadcasts.” However, it is worth looking at what prompted the ESA head to make these comments in the first place.

The term “video games” is often used as a pejorative by people with an axe to grind against the industry. It is often said in a dismissive tone, sometimes by people who prefer other forms of entertainment (for example, Roger Ebert’s infamous declaration that video games could never be art). There is also a curious disconnect with members of the older generation, who occasionally get the term “gaming industry” mixed up with the gambling industry, which is ironic as casinos originally adopted the term in order to appear more respectable.

For the younger generation, none of this may seem a pressing issue. But they are not the ones who are trying to ban violent video games, or set up government-controlled organizations to regulate their content. For over a decade, the video game industry has been a popular whipping boy for politicians eager to capture quick and easy “think of the children!” votes. It is this assault that Lowenstein is hoping to mitigate somewhat by raising people’s opinion of the industry itself.

It may be a long, uphill battle, but in the long term, victory is probably inevitable. As the current politicians age and fade off into the sunset, they will be replaced by people who have never known a world without video games. As happened with the movie and television industries before it, the game industry will come to be respected by default as games become just as commonplace.


Are our brains a gift of the Neanderthals? (Part I)


As many of you who follow science news are probably already aware, one of the hot topics this week is our brains. New evidence suggests that modern humans may have picked up a key genetic change that influences our brains from a species with an undeserved reputation for being a bit on the dim side: the Neanderthals. But to understand the evidence, some knowledge of evolutionary biology is going to be needed. So read on to learn more than you ever wanted to know about selective sweeps.

The underlying principle of evolution is that a useful genetic change in a gene will, thanks to selective pressures, become the dominant form of that gene in a species. The more useful it is, the faster this will happen. But genes don't exist as free-floating items: they are arranged along chromosomes, immense molecules of DNA that are key units of inheritance. Genes on the same chromosome are likely to be inherited together, so a chromosome carrying a useful change could be pulled into prevalence by that change. That means that a bunch of other useless changes (and possibly a few harmful ones) will also be taken along for the ride. In short, one helpful change can not only make its way to being the most common form, but it will carry along anything else in the DNA that's near it. This process is called a selective sweep.

The one thing that gets in the way of selective sweeps working with an entire chromosome is recombination, the shuffling of the genes that occurs with sexual reproduction. This process allows two chromosomes of the same type to exchange pieces at random points along their length. Because of recombination, the chromosomes you got from your mother are a patchwork of pieces of the chromosomes of both of your maternal grandparents. Over time, recombination will shuffle the areas near useful changes, and gradually return them to looking pretty average. The longer the time, the more likely that recombination will occur in the region, and the more average-looking the area will be. So, over time, the evidence of older selective sweeps will gradually fade into obscurity.
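
The hitchhiking-versus-recombination dynamic described above can be seen in a toy Wright–Fisher simulation. Everything here (population size, selection coefficient, recombination rate) is an illustrative choice, not a value from the article:

```python
import random

def sweep_sim(pop_size=1000, s=0.05, rec=0.001, gens=300, seed=2):
    """Toy Wright-Fisher model of genetic hitchhiking. A beneficial allele
    'B' starts out linked to a neutral marker 'm'; selection drives B toward
    fixation and drags m along with it, while rare recombination events
    gradually decouple the two. Returns the final frequencies of B and m."""
    random.seed(seed)
    # 20% of chromosomes initially carry the linked pair ('B', 'm');
    # the rest carry the ancestral pair ('b', 'M').
    n_start = pop_size // 5
    pop = [('B', 'm')] * n_start + [('b', 'M')] * (pop_size - n_start)
    for _ in range(gens):
        # fitness-weighted sampling of parent chromosomes
        weights = [1.0 + s if hap[0] == 'B' else 1.0 for hap in pop]
        offspring = random.choices(pop, weights=weights, k=pop_size)
        # occasional recombination swaps in the neutral marker of a random
        # second parent, breaking up the B-m association
        pop = [(hap[0], random.choice(pop)[1]) if random.random() < rec else hap
               for hap in offspring]
    freq_B = sum(hap[0] == 'B' for hap in pop) / pop_size
    freq_m = sum(hap[1] == 'm' for hap in pop) / pop_size
    return freq_B, freq_m

freq_B, freq_m = sweep_sim()
print(f"selected allele B: {freq_B:.2f}, hitchhiking marker m: {freq_m:.2f}")
```

Run it and the selected allele sweeps to (or very near) fixation, and the neutral marker ends up far above its starting 20 percent frequency purely because it was physically linked to the winner, exactly the signature discussed above.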

What does this look like at the DNA level? For a gene that's undergone a recent selective sweep, everyone in the population will have an identical (or nearly identical) DNA sequence. But it doesn't stop with the gene; even the areas around it will be identical, because they haven't had time to recombine. The more recent the sweep, the further away from the gene the identical sequences will extend. Thus, by looking for regions of the genome where there is a high degree of identity within a population, researchers can find genes that are being selected for. By looking at how far the identical regions extend, they can get a sense of how long ago the change that's being selected for appeared in the genome.

In essence, the Neanderthal story is based on researchers finding an area of the human genome that appears to have undergone a selective sweep, and figuring out the when, where, and why of its origin. Stay tuned to find out the details.


Gates: competitors tried to “castrate” Vista


With Vista just released to manufacturing, Bill Gates told reporters in Europe that antitrust concerns had not caused Microsoft to cut features from the operating system. Gates insisted that “the idea that we should make Windows better is a pretty pro-consumer idea,” according to the Wall Street Journal.

Although his company has had discussions for years with regulators, Gates said that none of them ever insisted that specific features be removed from Vista. The main regulator that concerns Microsoft is the European Commission, which has already fined the software giant millions of euros and has entertained visits from many Microsoft rivals. The Commission has repeatedly said that it will not issue an approval to products in advance, since doing so would amount to government censorship. And while the Commission itself may not have asked Microsoft to remove any features, the company was certainly aware of which competitors were making trips to Brussels.

These included several security vendors concerned about the expansion of Microsoft’s security offerings, and Gates had nothing positive to say about his rivals. He told reporters that competitors wanted to “castrate” the new operating system, which raises the obvious question: if Vista gets castrated, does it just become one of many UNIX?

Competitors did convince Microsoft to make some changes, but the company argues that it has not made Vista any less secure as a result. Sven Hallauer, the Director of Program Management at Microsoft, said in a separate statement that “security is top of mind for all who work at Microsoft.” He pointed to the company’s Secure Development Lifecycle plan for writing better code as an example of the process changes that Microsoft has made in order to increase security. Hallauer claims that an analysis of reported problems with Windows XP shows that the majority of them would be eliminated or reduced simply by switching to Vista.

Business users can find out whether the hype matches the reality on November 30, when Microsoft is planning a launch party for its corporate customers. Consumers will have to wait until January 30.


Are our brains a gift of the Neanderthals? (Part II)


With an understanding of selective sweeps out of the way, we can move on to the results presented in a new, open access article about a potential selective sweep. The change involved is in the middle of microcephalin, a gene that's critical for brain development. Loss of this gene reduces the size of the human brain three- to four-fold, and a specific form of the gene (we'll call it allele D) appears to do something useful for modern brains, as it's present in 70 percent of the modern human population, with even higher frequencies in some geographic regions (more on that later). Based on the size of the region carried along in the selective sweep, the D allele seems to have appeared in the human lineage about 37,000 years ago, well after the origin of modern humans 100,000-200,000 years ago.

But the authors note that, if the D allele had been generated by a mutation on a normal chromosome at that time, its sequence should resemble at least some alleles still present in the remaining 30 percent of chromosomes. They don't. After sequencing 30 kilobases from about 200 chromosomes, they found that the typical variation among non-D sequences was about 20-30 bases across this region. But comparing these to the D allele showed that the variation shot up to 70-80 bases. The D allele was so different, it almost looked like it had arrived from outer space—it certainly doesn't appear to have been the product of a traditional selective sweep.

Of course, claiming "aliens did it!" is not a good way to get a paper accepted by PNAS. Since the sequences looked as if they had come from different species, the authors treated them that way, and looked to see if they had a common ancestor. They did, but the big surprise was when: the best estimate was 1.1 million years ago, well before modern humans existed. The authors looked at a series of ways in which two alleles of a gene could have avoided recombining with each other for that sort of stretch of time, but found that none of them fit the data as well as the simplest one: the D allele was hiding out in a separate species entirely, and only found its way back into modern humans thanks to a rare (perhaps singular) case of interbreeding. Its reappearance in modern humans is the product of what's termed introgression; the introgression was then followed by a selective sweep.
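
A back-of-envelope molecular-clock calculation shows how a date like 1.1 million years falls out of those sequence numbers. Note that the mutation rate used below is an assumed, illustrative value, not one taken from the paper:

```python
# Rough molecular-clock estimate for the microcephalin D allele's ancestry.
# The article reports ~70-80 differences across the ~30 kb sequenced region
# between the D allele and non-D alleles. Both lineages accumulate mutations
# independently after they split, so the time back to their common ancestor
# is roughly d / (2 * mu), where d is per-site divergence and mu is the
# per-site, per-year mutation rate.

differences = 75      # midpoint of the reported 70-80 range
region_bp = 30_000    # ~30 kilobases sequenced
mu = 1.1e-9           # ASSUMED neutral mutation rate per site per year

d = differences / region_bp   # per-site divergence
t_years = d / (2 * mu)        # time to common ancestor

print(f"divergence = {d:.4f} per site, t = {t_years / 1e6:.1f} million years")
# -> divergence = 0.0025 per site, t = 1.1 million years
```

With a mutation rate in that plausible range, the arithmetic lands right around the authors' 1.1-million-year estimate, far older than modern humans themselves, which is what forced them to look for an explanation outside our own lineage.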

Modern humans originated in Africa, but by 37,000 years ago, were spreading around the globe, where they undoubtedly came into contact with the earlier species that had preceded them. Which of these predecessor species did we pick up microcephalin D from? This is where the geography came in. The D allele should be present at the highest rates where it's been around the longest—this happens to be Europe. And 37,000 years ago in Europe would mean the last days of the Neanderthals, a separate species that split from modern humans over 500,000 years ago.

The authors note that people have looked for signs of Neanderthal genes before, but haven't found any. But they point out that, in the absence of extensive interbreeding, small amounts of Neanderthal DNA would likely have been lost to genetic drift over the millennia. It is only the strong advantage provided by microcephalin D that appears to keep it visible. They also note that we shouldn't be surprised that the Neanderthals might have had some very useful genes; after all, they occupied some environments for hundreds of thousands of years before modern humans got there, and had plenty of time to adapt to them.
