
Udruženje Informatičara BiH - Cross-platform and portable development with PHP, LAMP/WAMP, AJAX and JavaScript, ASP, .NET, C#, C++, C, VB, Oracle; tutorials and tips...
Free as Freedom, not Free as Free Pizza!


 

Tuesday, December 19, 2006

Application Overload

Excess software -- redundant applications, unused modules in large suites, and plain old shelfware -- can eat up a big portion of your IT budget. Here are four strategies for putting your software inventory on a diet.

[more]

Assessments Are Opportunities

Opinion: Baseline reviews, audits and other assessments are opportunities to reevaluate direction, change strategy and enter new markets, says Bart Perkins. Don't let them go to waste.

[more]

What’s Keeping the Tort Lawyers at Bay

Opinion: Mark Willoughby explains why predictions that security breaches would trigger a flood of liability lawsuits haven’t come true.

[more]

Tooling Around Town

Zipcar member Nicole Francis likes the simplicity of the car-sharing company's technology.

[more]

Cutting the Waste

Tips for negotiating software deals without getting sold a lot of features you don't really need.

[more]

Seven Paradoxes of the IT World

Opinion: David Moschella contemplates seven paradoxes evident in IT today.

[more]

Buyers beware: Protect yourself from counterfeit technology products

It's more prevalent than you might think.

[more]

Here’s How

If you’re going to break an IT management rule, here’s how to go about it.

[more]

On Age and Attitude (2 letters)

Letters: Readers find age to be a state of mind.

[more]

WPF Isn’t There Yet

Letter: Microsoft’s development tool is where .Net was five years ago.

[more]

SSO? Still One Too Many Passwords

Letter: Single sign-on doesn’t eliminate passwords.

[more]

Taking Stock of the E-voting Situation (2 letters)

Letters: E-voting systems’ source code should be public.

[more]

Software Blamed for Voting Woes

A scathing report by outside consultants blames lengthy voting delays in Denver last month on a “fundamentally flawed” application that was designed to help poll workers check in voters.

[more]

Counties Work to Hide Data

Increasing concerns about identity theft are driving a growing number of counties to start redacting personal data from public records on their Web sites. But the work remains challenging.

[more]

Payroll Problems Plague $25M PeopleSoft Rollout

Problems with a PeopleSoft ERP rollout in Palm Beach County, Fla., have caused significant payroll problems for numerous school district workers.

[more]

Santa’s on His Way

Frankly Speaking: Frank Hayes relates a chat Santa had with an elf about annual gift giving to the IT world. And what's with all the penguins? Santa wants to know.

[more]

Selloff Plans, Fraud Probe Put Spotlight on Siemens

The enterprise networks unit at Siemens announced a line of desktop phones as the company continues to try to find a buyer for the operation and faces an investigation of German workers at its companion telecommunications equipment unit.

[more]

At Deadline Briefs

Short, late-breaking IT news items.

[more]

Wireless LANs Reach Round 2

Many IT managers plan to expand the uses of their wireless LANs by adding technologies such as dual-mode mobile phones. But some companies may be held back by concerns about upgrade costs and security.

[more]

Dual-Mode Phones Nourish Food Distributor’s Sales

A fruit and produce distributor in Chicago is using dual-mode mobile phones to enable workers to talk via a wireless LAN inside its warehouse and a cellular network outside the building.

[more]

Worm may be spreading via Skype chat

Experts disagree about reports that a password-stealing worm may be spreading via the popular Skype VoIP service, but they're studying the infection closely.

[more]

We ♥ data wizards

By Mark Heynen, Strategic Partner Development



One of the things I most enjoy about my job is interacting with small and medium-sized technology companies. It's common wisdom that such businesses are exceptionally dynamic and innovative, and my interactions with some of them over the past several months support this view.



Google Base offers a unique opportunity for these companies, particularly in the real estate industry. Several content providers are eager to gain more exposure online and have rich databases that are a great fit for Google Base -- but they don't have the resources to provide data feeds reliably. Here a data wizard can become a marketing consigliere -- cleaning data, preparing feeds, and advising on how a content provider should present data online. We've even created multi-client accounts and customized sign-on pages to make this as easy as possible. Finally, our API enables vendors to embed Base uploads directly into their client extranets.



In real estate, these folks range from companies that build websites using the IDX standard to companies that deal purely in data distribution to the Multiple Listing Services (MLS) that house the data for the industry. Here's the story from one company, Point2. Undoubtedly there are many other stories still to be told.



If you'd like to learn more about how a vendor or developer in the real estate arena can work with us, please visit http://www.google.com/baseforidx or contact us via email at postmylistings@google.com.
[more]

SES Chicago - Using Images

We all had a great time at SES Chicago last week, answering questions and getting feedback.



One of the sessions I participated in was Images and Search Engines, and the panelists had great information about using images on your site, as well as on optimizing for Google Image search.



Ensuring visitors and search engines know what your content is about

Images on a site are great -- but search engines can't read them, and not all visitors can. Make sure your site is accessible and can be understood by visitors viewing your site with images turned off in their browsers, on mobile devices, and with screen readers. If you do that, search engines won't have any trouble. Some things that you can do to ensure this:



  • Don't put the bulk of your text in images. It may sound simple, but the best thing you can do is to put your text in, well, text. Reserve images for graphical elements. If all of the text on your page is in an image, it becomes inaccessible.
  • Take advantage of the alt attribute for all of your images. Make sure the alt text is descriptive and unique. For instance, alt text such as "picture1" or "logo" doesn't provide much information about the image. "Charting the path of stock X" and "Company Y" give more details. (A quick way to audit this is sketched after this list.)
  • Don't overload your alt text. Be descriptive, but don't stuff it with extra keywords.
  • It's important to use alt text for any image on your pages, but if your company name, navigation, or other major elements of your pages are in images, alt text becomes especially important. Consider moving vital details to text to ensure all visitors can view them.
  • Look at the image-to-text ratio on your page. How much text do you have? One way of looking at this is to look at your site with images turned off in your browser. What content can you see? Is the intent of your site obvious? Do the pages convey your message effectively?
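
If you want a quick way to check your own pages against the alt-text advice above, a small script can flag images whose alt text is missing or generic. This is a minimal sketch using only Python's standard library; the list of "generic" values and the sample markup are illustrative assumptions, not anything from the session.

```python
# alt_audit.py -- rough sketch: flag <img> tags with missing or uninformative alt text.
# The GENERIC_ALT word list and the sample markup below are illustrative assumptions.
from html.parser import HTMLParser

GENERIC_ALT = {"", "image", "picture", "picture1", "logo", "photo"}

class AltAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attributes = dict(attrs)
        alt = (attributes.get("alt") or "").strip().lower()
        if alt in GENERIC_ALT:
            self.problems.append((attributes.get("src", "?"), alt or "(missing)"))

sample = """
<img src="chart.png" alt="Charting the path of stock X">
<img src="logo.gif" alt="logo">
<img src="spacer.gif">
"""

auditor = AltAudit()
auditor.feed(sample)
for src, alt in auditor.problems:
    print(f"weak alt text: {src} -> {alt}")
```

Run against the sample markup, it flags the "logo" and missing-alt images and leaves the descriptive one alone.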


Taking advantage of Image search

The panelists pointed out that shoppers often use Image search to see the things they want to buy. If you have a retail site, make sure that you have images of your products (and that they can be easily identified with alt text, headings, and textual descriptions). Searchers can then find your images and get to your site.



One thing that can help your images be returned for results in Google Image search is opting in to enhanced image search in webmaster tools. This enables us to use your images in the Google Image Labeler, which harnesses the power of the community for adding metadata to your images.



Someone asked if we have a maximum number of images per site that we accept for the Image Labeler. We don't. You can opt in no matter how many, or how few, images your site has.
[more]

Building link-based popularity

Late in November we were at SES in Paris, where we had the opportunity to meet some of the most prominent figures in the French SEO and SEM market. One of the issues that came up in sessions and in conversations was a certain confusion about how to most effectively increase the link-based popularity of a website. As a result, we thought it might be helpful to clarify how search engines treat link spamming to increase a site's popularity.



This confusion lies in the common belief that there are two ways to build the link-based popularity of your website: the meritocratic, long-term option of developing natural links, or the risky, short-term option of acquiring non-earned backlinks through link-spamming tactics such as buying links. We've always taken a clear stance on manipulating the PageRank algorithm in our Quality Guidelines. Despite these policies, participating in link schemes might have paid off in the past. But Google has since refined its link-weighting algorithms considerably, and we have more people working on link-weighting quality control and on correcting the issues we find. Nowadays, trying to undermine the PageRank algorithm is likely to cost link-selling sites the ability to pass on reputation to other sites through their links.



Search engines' discounting of non-earned links has opened up a wide field of tactics for building link-based popularity. The classic approach is to optimize your content so that thematically related or trusted websites link to you by choice. A more recent method is link baiting, which typically takes advantage of Web 2.0 social content websites. One example of this new way of generating links is to submit a handcrafted article to a service such as http://digg.com. Another is to build authority in a certain field through services such as http://answers.yahoo.com. Our general advice is: always focus on the users, not on search engines, when developing your optimization strategy. Ask yourself what creates value for your users. Investing in the quality of your content, and thereby earning natural backlinks, benefits your users and drives more qualified traffic to your site.



To sum up, even though improved algorithms have encouraged a shift away from paid or exchanged links towards earned organic links, there still seems to be some confusion within the market about what the most effective link strategy is. So when taking advice from your SEO consultant, keep in mind that nowadays search engines reward sweat-of-the-brow work on content that attracts natural links given by choice.



[more]

Deftly dealing with duplicate content

At the recent Search Engine Strategies conference in freezing Chicago, many of us Googlers were asked questions about duplicate content. We recognize that there are many nuances and a bit of confusion on the topic, so we'd like to help set the record straight.



What is duplicate content?

Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Most of the time when we see this, it's unintentional or at least not malicious in origin: forums that generate both regular and stripped-down mobile-targeted pages, store items shown (and -- worse yet -- linked) via multiple distinct URLs, and so on. In some cases, content is duplicated across domains in an attempt to manipulate search engine rankings or garner more traffic via popular or long-tail queries.



What isn't duplicate content?

Though we do offer a handy translation utility, our algorithms won't view the same article written in English and Spanish as duplicate content. Similarly, you shouldn't worry about occasional snippets (quotes and otherwise) being flagged as duplicate content.



Why does Google care about duplicate content?

Our users typically want to see a diverse cross-section of unique content when they do searches. In contrast, they're understandably annoyed when they see substantially the same content within a set of search results. Also, webmasters become sad when we show a complex URL (example.com/contentredir?value=shorty-george&lang=en) instead of the pretty URL they prefer (example.com/en/shorty-george.htm).



What does Google do about it?

During our crawling and when serving search results, we try hard to index and show pages with distinct information. This filtering means, for instance, that if your site has articles in "regular" and "printer" versions and neither set is blocked in robots.txt or via a noindex meta tag, we'll choose one version to list. In the rare cases in which we perceive that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved. However, we prefer to focus on filtering rather than ranking adjustments ... so in the vast majority of cases, the worst thing that'll befall webmasters is to see the "less desired" version of a page shown in our index.



How can webmasters proactively address duplicate content issues?

  • Block appropriately: Rather than letting our algorithms determine the "best" version of a document, you may wish to help guide us to your preferred version. For instance, if you don't want us to index the printer versions of your site's articles, disallow those directories or make use of regular expressions in your robots.txt file. (A small sketch for sanity-checking this, and the 301 redirects mentioned below, follows this list.)
  • Use 301s: If you have restructured your site, use 301 redirects ("RedirectPermanent") in your .htaccess file to smartly redirect users, the Googlebot, and other spiders.

  • Be consistent: Endeavor to keep your internal linking consistent; don't link to /page/ and /page and /page/index.htm.
  • Use TLDs: To help us serve the most appropriate version of a document, use top level domains whenever possible to handle country-specific content. We're more likely to know that .de indicates Germany-focused content, for instance, than /de or de.example.com.
  • Syndicate carefully: If you syndicate your content on other sites, make sure they include a link back to the original article on each syndicated article. Even with that, note that we'll always show the (unblocked) version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer.
  • Use the preferred domain feature of webmaster tools: If other sites link to yours using both the www and non-www version of your URLs, you can let us know which way you prefer your site to be indexed.

  • Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details.
  • Avoid publishing stubs: Users don't like seeing "empty" pages, so avoid placeholders where possible. This means not publishing (or at least blocking) pages with zero reviews, no real estate listings, etc., so users (and bots) aren't subjected to a zillion instances of "Below you'll find a superb list of all the great rental opportunities in [insert cityname]..." with no actual listings.
  • Understand your CMS: Make sure you're familiar with how content is displayed on your Web site, particularly if it includes a blog, a forum, or related system that often shows the same content in multiple formats.
  • Don't worry, be happy: Don't fret too much about sites that scrape (misappropriate and republish) your content. Though annoying, it's highly unlikely that such sites can negatively impact your site's presence in Google. If you do spot a case that's particularly frustrating, you are welcome to file a DMCA request to claim ownership of the content and have us deal with the rogue site.
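
Two of the points above -- blocking printer versions via robots.txt and returning 301s after a restructuring -- are easy to sanity-check mechanically. Below is a rough sketch using only Python's standard library; the domain, paths, and sample robots.txt are hypothetical, and the robots.txt check assumes a plain directory Disallow rule rather than wildcard patterns.

```python
# dedupe_checks.py -- rough sketch: verify printer pages are blocked and old URLs 301.
# The domain, paths, and sample robots.txt are hypothetical placeholders.
import http.client
import urllib.robotparser

HOST = "www.example.com"

# The kind of rule the "Block appropriately" tip describes. In practice you would let
# RobotFileParser fetch https://www.example.com/robots.txt instead of parsing a string.
SAMPLE_ROBOTS = """\
User-agent: *
Disallow: /print/
"""

def printer_pages_blocked(path="/print/some-article.html"):
    """True if the robots.txt rules keep crawlers out of the printer directory."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(SAMPLE_ROBOTS.splitlines())
    return not rp.can_fetch("Googlebot", f"https://{HOST}{path}")

def old_url_redirects(old_path="/old-section/page.htm", new_path="/new-section/page.htm"):
    """True if the old URL answers with a 301 pointing at the new location."""
    conn = http.client.HTTPSConnection(HOST, timeout=10)
    conn.request("HEAD", old_path)  # http.client does not follow redirects on its own
    response = conn.getresponse()
    location = response.getheader("Location", "")
    conn.close()
    return response.status == 301 and location.endswith(new_path)

if __name__ == "__main__":
    print("printer pages blocked by robots.txt:", printer_pages_blocked())
    print("old URL returns a 301:", old_url_redirects())
```

If your robots.txt relies on wildcard patterns, test those against the search engines' own checking tools rather than this parser, which may not understand them.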


In short, a general awareness of duplicate content issues and a few minutes of thoughtful preventative maintenance should help you to help us provide users with unique and relevant content.
[more]

Dwelling on the past

As a backend engineer, one of my favorite features of Google Reader is its ability to track the history of a feed over time. Reader takes a snapshot of feeds periodically and saves the content, so you can see posts that are days or weeks old. It's a neat way to read the web; in a way, it lets you look back in time. Combined with Reader's ability to track what you have and haven't read, you can safely jet off to Tahiti for a few weeks and never miss a post.

Ideally, though, you'd like to catch up on those posts in the order they were written. That's why we're releasing one of our most requested features: sorting by oldest-first. Now you can read those Lost episode summaries in the right order after you've shaken the sand out of your shoes. It's available in the view settings menu, so you can select it only for the feeds or folders you prefer.

Careful observers will note that we've also added sort by auto to view settings. This nifty feature mixes feeds together according to posting frequency, so items from rarely-updated feeds (your friend's blog) show up higher than items from frequently-updated feeds like The New York Times. Look for this feature to evolve over time as we try to find other ways of highlighting the most interesting content in your feeds. Enjoy! [more]

Savings Measured in Millions



Lakehead University became our first large-scale deployment of Google Apps for Education in Canada and shared with us some truly impressive statistics. Lakehead transitioned 38,000 students, faculty and alumni to Google Apps for Education in just one week. We think that getting all three of these groups on the same collaboration system should have a huge impact on learning (and student social calendars) as well as keep alumni involved in the campus community. Users should be excited about going from 60 MB of storage on their prior email system to 2 GB with Google Apps - eliminating the need to delete those large project files that happen to become useful come finals time.

What's most impressive is that Shahzad Jafri, Lakehead's Chief Information Officer, estimates that Lakehead will save $2-3 million in maintenance costs annually as well as $6 million in infrastructure costs - which is a big win for us and them! Read their press release for more information. [more]

Indulge in some holiday giving





For many, the holidays are both a great time to shop and a chance to help out those in need. We hope that Google Checkout can help you accomplish a bit of both: As part of our holiday promotion, donors have the option to give to a few of our nonprofit partners.



If you do decide to donate, we'll make sure your generosity goes a little farther. Every penny of your donation goes to the organization, and we'll chip in $10 for your first donation of $30 or more.
[more]

Holiday goodies from Picasa Web Albums





What's a holiday without the memorable (and embarrassing) photos? The holidays are almost here, which (at least if you're in my family) means babies chewing on presents, the dog dressed in a ridiculous reindeer costume, and someone (cough, Uncle Charlie) passed out after too much eggnog. Although I think about this now and wince in advance, I know that I'm going to want to capture these moments and, more importantly, share them with the rest of my family and friends. That's why I'm excited about the new features we've added to Picasa Web Albums, just in time for the holidays.



Print ordering is my favorite -- it's something you have told us you've wanted since we first launched. Now, when you or anyone else views photos in Picasa Web Albums, there's an option to order prints directly from the site. We currently offer prints and products from Shutterfly and PhotoWorks, but we'll be adding more soon.



Other new features include video upload for easy sharing (it's just like with photos -- select them in Picasa and click the "Web Album" button) and searching tools. Now you can search over your own captions, album titles, and album descriptions, and you can even search for photos in your friends' public albums. Digging up that picture of me trying to figure out which end of the holiday turkey is "up" should be easier than ever.



So check out these new features before all the festivities start. And however you celebrate the holidays this year, I hope you'll take lots of pictures.
[more]

Your easiest holiday task





We launched Google Apps for Your Domain at the end of August, and since then we've been getting great feedback from people all over. Organizations from Thailand, Argentina, and even our neighbors in Palo Alto have set up private-label Gmail, Google Talk, Google Calendar, and spiffy customizable start pages for their custom domains. We think it's especially cool that thousands of students are able to connect better with their classmates -- and their schools' IT directors no longer need to wring their hands over spam and clogged inboxes.



"Hey, wait, Costin," you say. "That's great for them, but our organization doesn't have a custom domain."



Well, I'm excited to let you know that we've made signing up for Google Apps for Your Domain much easier for those of you that don't yet have your own domain. We've partnered with GoDaddy.com and eNom, two leading domain registration services, to offer domains for $10 per year. And I like the fact that we're including private registration to protect your personal information.



Now you've got one-stop shopping for all the services currently on the Google Apps for Your Domain platform -- just find a domain, buy it, and get started. We'll do all the behind-the-scenes configuration work for you. For now this is available for .com, .net, .org, .biz, and .info domains, but we're working on bringing it elsewhere soon. We're also constantly working to introduce more cool new features to this service, so be sure to check back for updates.
[more]

Ad and image placement: a policy clarification

We've recently received a number of emails from publishers asking how we feel about the placement of images near Google ad units. There's been some confusion on this issue, and so we turned to our policy team to set the record straight.



Can I place small images next to my Google ads?



We ask that publishers not line up images and ads in a way that suggests a relationship between the images and the ads. If your visitors believe that the images and the ads are directly associated, or that the advertiser is offering the exact item found in the neighboring image, they may click the ad expecting to find something that isn't actually being offered. That's not a good experience for users or advertisers.



Publishers should also be careful to avoid similar implementations that people could find misleading. For instance, if your site contains a directory of Flash games, you should not format the ads to mimic the game descriptions.



What if I place a space or a line between my images and my ads? Would that work?



No. If the ads and the images appear to be associated, inserting a small space or a line between the images and ads will not make the implementation compliant.



Does this mean I can't place ads on pages with images?



You can definitely place Google ads on pages containing images -- just make sure that the ads and images are not arranged in a way that could easily mislead or confuse your visitors. For example, if you run a stock photography site with a catalog of thumbnail images, don't line the ads up with the thumbnails in a way that could be misleading. Consider using a full border around your ads or changing your ad colors, for example.



What do unacceptable implementations look like?



Here are some examples that wouldn't comply with our policies.

[example screenshots omitted]

[more]

Sandbox Maintenance - December 22

We will perform our regular refresh of the API Sandbox database on December 22 at approximately 2pm US Pacific Time. Therefore, the Sandbox will be unavailable for a few hours.



As a result of this database refresh, all Sandbox user and account data will be erased. To re-create your five Sandbox client accounts, you will first need to make a request without including a clientEmail header (call getClientAccounts, for example); a rough sketch of such a request follows.
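
For reference, a request of that shape might look like the sketch below. This is only an illustration: the endpoint URL, namespace, and SOAP header names are best-guess assumptions about the 2006-era API rather than confirmed values, and the credentials are placeholders.

```python
# sandbox_reset.py -- rough sketch: call getClientAccounts with NO clientEmail header,
# which (per the announcement above) re-creates the five Sandbox client accounts.
# Endpoint, namespace, and header names are assumptions/placeholders, not confirmed values.
import urllib.request

ENDPOINT = "https://sandbox.google.com/api/adwords/v8/AccountService"  # assumed
NS = "https://adwords.google.com/api/adwords/v8"                       # assumed

soap_request = f"""<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:aw="{NS}">
  <soapenv:Header>
    <aw:email>your-login@example.com</aw:email>
    <aw:password>your-password</aw:password>
    <aw:useragent>sandbox-reset-sketch</aw:useragent>
    <aw:developerToken>your-developer-token</aw:developerToken>
    <aw:applicationToken>your-application-token</aw:applicationToken>
    <!-- deliberately no clientEmail header: the account-less request is what
         triggers creation of the five Sandbox client accounts -->
  </soapenv:Header>
  <soapenv:Body>
    <aw:getClientAccounts/>
  </soapenv:Body>
</soapenv:Envelope>"""

request = urllib.request.Request(
    ENDPOINT,
    data=soap_request.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": ""},
)
with urllib.request.urlopen(request) as response:
    print(response.read().decode("utf-8"))
```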



-- Jon Diorio, Product Marketing



[more]

CORRECTION: Sandbox Maintenance is Dec 21

We apologize for the last minute change, but the Sandbox maintenance period will actually be conducted at 2pm US Pacific Time on Thursday, December 21 (and not Dec 22 as previously announced).



-- Jon Diorio, Product Marketing
[more]

Enterprise Study Subjects: We Have Security Issues

Jeanine Sterling, an analyst for InfoTech: The Telecom Intelligence Group, paints a troubling — but also somewhat hopeful — picture of wireless security in her recent report, “Securing the Wireless Workplace: Partnering with the Enterprise.” The study is discussed in this release, which appeared on eMedia Wire. Enterprises begin a line of questioning with the opinion [...] [more]

Open Source Venture Funding Up in 2006

The end of a year always brings product review roundups, best and worst lists, top trends lists and, of course, predictions for what the new year will bring. The open source segment is no different. Grabbing headlines in open source these days is the fact that venture capital investments in open source have grown. Exactly how [...] [more]

Tech Predictions for the 2007 Data Center

This is the time of year when tech publications begin publishing their predictions for enterprise technology in the coming year. First up is a range of opinions gathered by SYS-CON, mostly about software development issues like Ajax and Ruby on Rails. A number of infrastructure predictions also pricked up our ears, such as flash-bootable PCs, which will supposedly extend [...] [more]

Yahoo! Discloses, Patches IM Vulnerability

Yahoo! says it has no reports of actual exploits of a bug it found in its popular Yahoo! Messenger software, but the company has released a patch for the bug, which could result in a PC hijacking. Not surprisingly, the flaw is tied to ActiveX controls and the potential for creating buffer overflows. Danish security firm Secunia – which has made headlines [...] [more]

No Way to Fix Spam Problem

We read a lot of articles every week about e-mail and spam. The sad fact is that much of what is written on the topic either explicitly or indirectly concludes that e-mail has been irretrievably broken as an effective communication medium. And, yes, there are companies that have abandoned e-mail altogether, in the face of too many [...] [more]