Tuesday, 30 April 2013

Google Adsense - The "Duplicate Content" Controversy



The hoopla over duplicate content has been going on for quite some time now, and I see it as just another money-making scheme by online entrepreneurs chasing the Golden Goose.  Almost every day, my inbox is inundated with yet another "article converter" guaranteed to send my private label rights articles to the top of the search engines with no fear of the Google Police knocking at my PR door, screaming "Duplicate Content!"
I often wonder how many of the so-called gurus take the time to really read the Google Adsense Program Policies.  And I wonder many times during my working day just how many people open their wallets and let fly their hard-earned dollars to these people. 
Here are Google's exact words, and I quote: "Do not create multiple pages, subdomains, or domains with substantially duplicate content."  What does this really tell us?  Does it tell us that the PLR sites that sell thousands of copies of the same articles to people who don't have time - or are too lazy - to write their own content are breaking Google's rules?  Hardly.  Google is telling us that we cannot create what used to be called "mirror sites" (a mirror site is a website containing the same information as another site; if abc.com is the same as def.com, it may be disqualified from listing by search engines) in an attempt to increase PageRank and Adsense income. 
Many opinions abound on the forums and elsewhere on the web discussing duplicate content.  And many netrepreneurs have taken advantage of the misinterpretation of Google's policies to capitalize on this.  Because Google has made this the era of content, everyone involved in the online communities is scrambling for the proper answers.  I see threads three to five pages long on the more popular forums, with people agonizing over their fear of duplicate content.  What a field day for the gurus!  I wonder how many thousands - perhaps millions - have been made by people taking advantage of this fear factor?
Let's examine the facts.  If there really were a duplicate content filter, then many news websites that publish AP or Reuters stories would be banned from search engines.  Many catalogue sites would go under, because they sell the same products using the same promotional material as other sites. Affiliate sites would be banned from the search engines, because people use the promotional items provided by the site owners.  And even the giant eBay would go under, because anyone who has spent time there sees a ton of identical listings using the same description, same images, and same user ID.  I wonder how Copyscape.com would handle this?
And what about sites that archive their articles and ezines? That content ends up being displayed both in static pages and in archives.  Penalized for duplicate content, when the website owner simply wants his articles available to the general public?  I doubt it...
Common sense is the order of the day.  If you take the time to provide original and unique content to your site, the site is well optimized for the search engines, and you have relevant backlinks, then your site will do well with no fear of penalty. 
Don't use article scrapers, which mirror the exact content of other sites and are nothing more than a rip-off.  If you buy PLR articles, try to rewrite them in your own unique voice.  If your budget allows, hire a ghostwriter to create articles pertinent to your particular niche.  And most of all, just use plain common sense!

How to build great software



In this article I'm going to explain the top 10 software development fallacies my company avoids. By avoiding these myths and concentrating on excellence, we are able to build high-quality software.
Myth 1) Software must be designed in detail before development starts, so that a clear plan can be laid out.
The truth) The more complex a design, the more like software the design itself becomes. By perfecting a design and then writing software to that design, you're effectively doing the work twice. Instead, by producing just some simple design sketches and data modelling rather than a book-like design, a good development team can create a shell for the software and efficiently refine it towards the finished product. This process of refinement creates natural prototypes, allows easy adaptation when issues arise that a design would not have foreseen (or that a client raises as fresh concerns), and takes significantly less time overall. Pulling this off requires a close team, skill, and experience, but it is by far the best option for the majority of situations.
Myth 2) There are programmers, designers, analysts, and users.
The truth) By structuring development so that all developers get some exposure to each part of the development process, skills may be shared and greater insight may be gained. If developers are encouraged to actually use the software then they can use that expertise to think of improvements that otherwise would not come to light.
Myth 3) A happy team is a productive team.
The truth) A team of people with a wide variety of natural skills, experience and concern, that criticises each other and argues vehemently over the smallest details, will bring up and resolve issues that otherwise would never be tackled. A furnace of relentless argument is the best way to forge understanding and reach perfection.
Myth 4) It's important we understand our direction and don't compromise with it.
The truth) Life is compromise, and compromise is not a weakness. There will always be issues (such as efficiency, budget, ease-of-use, power, scope, and the need for easy internationalisation) that cannot be simultaneously met without such compromise.
Myth 5) We know what the client wants, we know what the issues are.
The truth) Without constant re-evaluation, it is easy to lose track of the objective. Developers often treat the problems in front of them as the real issues, when those problems are in fact disconnected from the actual market goals and can become totally irrelevant. Developers must always understand the market goals and be able to adapt when other things change, or even when the goals themselves change.
Myth 6) Bigger is better. Features are cool.
The truth) Features can easily confuse users, and their actual value should always be considered against the cost of confusion. In some cases it is sensible to actually remove working features due to such concerns.
Myth 7a) The customer is always right.
The truth) Most customers try hard not to look ignorant in front of software developers, and hence phrase their suggestions in a technical way. The effect is that often suggestions aren't really appropriate, because they're not founded on a solid understanding of technical issues.
Myth 7b) The customer is often wrong.
The truth) Although customers' needs are often not best met by doing literally what they say, customers always know what they want and why they want it - and usually for very good reason. Understand them and adapt what they say, discuss with them, but never ignore them.
Myth 8) Comment your code a lot.
The truth) Good code needs hardly any commenting, because sensible uses of naming and white-space are better alternatives. Comments should only ever explain the non-obvious, or provide standard API documentation.
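To make that concrete, here's a small hypothetical Python illustration (the function names are invented for this example). The first version restates the obvious in comments; the second lets naming do the work:

    # Over-commented: every comment restates what the code already says.
    def calc(a, b):
        # multiply the hours by the rate
        t = a * b
        # return the total
        return t

    # Self-documenting: the names carry the meaning, so comments are
    # only needed for the genuinely non-obvious.
    def invoice_total(hours_worked, hourly_rate):
        return hours_worked * hourly_rate

Both functions do the same thing, but the second needs no commentary at all to be understood.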
Myth 9) Such and such is needed, such and such is great.
The truth) A bad workman blames his tools. Whilst some development tools aid development substantially, a good developer can achieve great results with most tools served to them. There are a few exceptions, like Microsoft Access or assembly language, but generally speaking the difference in the quality of results is due much more to the skills of the developers than to the quality of their tools.
Myth 10) The customer will understand if there's an efficient and easy-to-use interface.
The truth) The interface doesn't just need to be easy to use; it needs to be navigable without an overall understanding of the system. Screens need to be self-describing.

Blu-ray Technology


New technology is now making it possible for viewers to record and store high definition programming onto optical discs. Blu-ray Disc is one method of recording HD content onto an optical disc: a blue-laser disc storing video encoded as MPEG-2 or MPEG-4. Systems that use this technology will be able to play traditional DVDs, but the goal of Blu-ray is to create an image that's as close to the HD format as possible. The name Blu-ray comes from the blue laser that reads and writes information on each disc. Blu-ray technology may very well revolutionize the world of high definition programming. The Blu-ray disc format offers far greater storage potential, usually 25 gigabytes per layer, which greatly exceeds the 4.7 gigabytes of a standard single-layer DVD. One single-layer Blu-ray disc can hold about four hours of high definition content. A two-layer disc can contain eight hours of HD content. Four- and eight-layer discs are now in the works; these would have storage capacities of 100 and 200 gigabytes. The Blu-ray recording system uses a shorter laser wavelength than traditional CDs and DVDs, and this is part of what allows it to hold more content on a single disc.
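As a back-of-envelope check on those figures, here's a quick Python sketch. The 14 Mbit/s average HD bitrate is an assumption chosen for illustration; real streams vary:

    # Rough capacity check: 25 GB at an assumed 14 Mbit/s HD stream.
    capacity_gb = 25                  # single-layer Blu-ray disc
    bitrate_mbps = 14                 # assumed average HD video bitrate

    capacity_bits = capacity_gb * 1e9 * 8
    hours = capacity_bits / (bitrate_mbps * 1e6) / 3600
    print(f"about {hours:.1f} hours of HD video")   # about 4.0 hours

That lines up with the roughly four hours per 25GB layer quoted above.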

Blu-ray has also influenced the computer industry, specifically in terms of data storage capability. A number of major companies have come out in support of Blu-ray, including Apple, Dell, Hitachi, Pioneer, and Sony. Hewlett Packard plans to market desktop computers and laptops that utilize Blu-ray technology. Sony has announced that it will introduce a Blu-ray component in PlayStation 3, which is expected to appear in November of this year. Microsoft has also said that it may add a Blu-ray component to its Xbox 360. Currently, Blu-ray is only available in Japan, but it will appear in the United States in May, in video games and a DVD system that recreates a high definition effect on a viewer's TV.

Many movie studios have Blu-ray films planned for future release. In 2005, Sony Pictures cornered the market on the first Blu-ray feature-length movie disc, which was none other than Charlie's Angels: Full Throttle. Studios that support the technology include Walt Disney and Twentieth Century Fox.

Competing with Blu-ray in the area of HD storage is HD DVD. HD DVD discs have less storage capacity, but they're cheaper to produce. Other big-name companies are showing their support for this option, including Microsoft, Intel and Toshiba, as well as Universal Studios. In what may be the smartest move, some companies are backing both technologies, ensuring that their products support both Blu-ray and HD DVD. These companies include Samsung, Paramount and Warner Brothers. 

Guide to Buying a Laptop Computer


With the many different brands and models on the market, buying a laptop or notebook computer may at first seem confusing. But simply breaking the process down into a few key areas and using Myshopping.com.au to search for the most suitable features and pricing makes it much easier to find exactly the machine you're looking for. 

How important to you is mobility? 

Mobility in laptop computers is a combination of size, weight and battery life: how often you will carry it around, and whether you will rely mainly on the notebook's battery or have access to an external power source. Laptops can weigh from a little over 1kg up to 6kg, depending on the model and features included. The screen, storage space and disc drive all affect the weight. 

Battery life is shortened by bigger screens and multiple disc drives. Manufacturers advertise the weight of the laptop in their specifications, but it is important to consider whether that figure includes batteries and other peripherals, such as external drives, that you may be lugging around. The most common battery type is Lithium Ion (Li-Ion), which can operate for one to three hours under normal working conditions, but many power-saving options are available, and higher-celled batteries can extend the discharge time considerably. Battery life deteriorates over time, however, and as your laptop ages the battery will hold less charge. Sometimes it is worthwhile carrying an additional battery. 

Application and cost 

If mobility is of lesser concern, then battery life and weight will be less important, and you may be more inclined towards a bigger processor, screen size and memory capacity. The type of work you do affects the screen size and type that is most suitable for you. For a lighter load and less graphics-intensive applications, a 12-14in screen rather than a 15in or 17in widescreen will be more suitable. If, on the other hand, graphics capability and screen size are important, then the best screen you can afford will be more of a priority. It may work out cheaper to buy a basic unit and add such things as an external TV card and DVD burner when the need arises. 

How much you need to spend is closely related to how you use your laptop. If you only want to access your e-mail, browse the Web and do word processing, then you can consider lower budget machines with smaller processors, screens and facilities. 

A medium-level user, perhaps playing games or working in multimedia applications, will need a powerful processor, a capable graphics controller, more storage space, and a bigger screen. The more features your laptop has, the more expensive it will be. Including a DVD burner instead of a DVD-ROM drive, a hard drive of more than 40GB, a 17in widescreen display and wireless capabilities results in a more expensive machine. 

If you are not looking for high power and graphics capabilities, then you may find a suitable laptop for around $1500. The latest processor, full-blown graphics capability, a DVD burner, widescreen and wireless connectivity may cost over $4000. Use Myshopping.com.au to search within different price ranges. 

Other key components 

Having determined what you will do with it and how mobile you need to be, and that you are definitely buying a laptop, you now need to get down to the nitty gritty and find the specifications that will meet your needs. So, what to look for? Essentially, you are weighing up differences between the following components: display, graphics controller, memory (RAM), hard disk, removable storage, networking options, peripheral connectivity, sound and battery. 

Display and Graphics 

Notebooks now all feature LCD (Liquid Crystal Display) screens, presenting crisp text and reducing eyestrain. These screens display sharper text than standard CRT monitors, but are less capable of displaying well-rendered graphics. If you will be using your notebook for graphics work, it may be worthwhile having a CRT monitor to connect to. Screen sizes for notebooks range from 12.1in to 17in (widescreen). A 15in display, or its 15.4in widescreen alternative, is the most common in notebooks today. Widescreen is quickly becoming more common, partly to accommodate playback of DVDs and also because widescreen proportions make the unit more durable. 

On-screen graphics are affected by the size and type of screen as well as by the graphics card. It is reasonably safe to assume that larger displays offer higher on-screen resolution. Screen brightness (measured in nits) is another specification that varies between makes and models. Brighter screens are easier on the eyes and can be read more easily in bright conditions. Some manufacturers apply a glossy, reflective coating over the display, improving contrast and colours; but because it increases the reflectivity of the screen, it can show your own reflection, and surface scratches may also show up more readily. Not all LCD screens have the same viewing angle, and some are not easily viewed from the side. 

Graphics performance in laptops is still inferior to that of desktop machines. All graphics controllers easily render 2D images, and if you don't need more from your graphics, then an integrated graphics controller is ample. However, if you want to play the latest 3D games at a decent resolution and frame rate, or you're a CAD designer, then you'll need a discrete graphics controller with dedicated DDR video memory. 

Memory and Storage 

In all computers, RAM chips keep the CPU efficiently fed with data and instructions from programs on the hard drive. Notebook computers now commonly use DDR SDRAM (Double Data Rate SDRAM), the default standard, and DDR2 SDRAM, a next-generation memory type offering considerable performance and power benefits over DDR. Either way, when it comes to RAM, more memory is better, and you should consider 256MB the absolute minimum. Upgrading memory can achieve better performance, and quite a number of vendors offer higher RAM configurations as a 'deal sweetener' at the time of purchase. Search through Myshopping.com.au for bundled extras such as more RAM. 

The hard drive provides long-term storage and holds your programs and data. There are two critical specifications of hard disks. One is disk speed, measured in revolutions per minute (rpm): faster disk speeds provide quicker access for loading, saving and 'file swapping'. The other is storage capacity; drives are now available for notebook computers with 120GB capacity. If you work with large files, then you will probably want at least 40GB of hard drive space. You may also want to consider which type of removable storage, such as a DVD writer, removable hard disks or media ('flash') card systems, will suit your use best. 

Networking and connectivity 

Laptop computers now include a 56Kbps modem (RJ-11) and 10/100 Ethernet (RJ-45) connections as standard features. Some feature an infrared port, which you can use to connect your mobile phone. Other wireless technologies for connecting mobile phones, printers and PDA devices include Bluetooth and Wi-Fi, the latter allowing connection at certified public access points and in home wireless networks. Most laptops use USB 2.0 or FireWire connections for keyboards, mice, printers, cameras and other peripherals. Nearly every new notebook will have around three USB 2.0 ports, one FireWire port and a VGA-out port for connecting an external monitor. 

Notebook computers have traditionally been able to expand their capability through simple plug-in PC Cards. Recently a new standard has emerged called ExpressCard, a smaller, faster and more portable plug-in card to provide such things as expanded video and sound capacity. 

Choosing a laptop becomes much easier once you've decided on these basic requirements. You can search Myshopping.com.au to compare makes, models, prices, accessories and all the important specifications. You can also compare vendors and their prices and service. 

Buying Guide to Graphics Cards

The graphics card is a vital performance component of your computer, particularly if you play 3D games or work with graphics and video content. The graphics card sits in an expansion slot in your PC and is specifically designed to process image data and output it to your monitor. A graphics card works by calculating how images appear, particularly 3D images, and rendering them to the screen. 3D images and video take a lot of processing capacity, and many graphics processors are complex, require fans to cool them and need a direct power supply. The graphics card consists of a graphics processor, a memory chip for graphics operations, and a RAMDAC for display output. It may also include video capture, TV output, SLI and other functions. You can find the graphics card that suits you by comparing specifications between brands and vendors on Myshopping.com.au.

At Myshopping.com.au you can compare a great range of products, and assess them according to their specifications, brands, prices and vendors.

Graphics Cards

What are your needs?

The first decision you need to make is whether you need a graphics card for handling 3D images or whether you simply require 2D image rendering. For 2D requirements, you need only a low-cost solution; in many cases, an integrated graphics solution will suffice.

However, with 3D graphics, the performance of the graphics card impacts directly on the frame rate and image quality of 3D programs and games. The differences between low-end and high-end cards can be substantial, both in cost and in performance.

Rendering 3D graphics is like lighting a stage: both the geometry of the shapes in question and the lighting of the scene need to be taken into account. The geometry calculation works out which parts of an object can and can't be seen, given the position of the eye and its perspective. The lighting calculation takes the direction of the light sources, their intensities and the respective shadows that result. The second part of presenting a 3D image is rendering colours and textures onto the surfaces of the objects, modifying them according to light and other factors.
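To give a feel for the lighting half of that calculation, here is a minimal Python sketch of simple diffuse (Lambertian) shading. It's an illustrative model, not how any particular card implements lighting:

    import math

    def normalize(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    def diffuse_intensity(surface_normal, to_light, light_strength=1.0):
        n = normalize(surface_normal)
        l = normalize(to_light)
        # The dot product gives the cosine of the angle between the surface
        # normal and the light direction; clamping at zero means surfaces
        # facing away from the light receive no illumination.
        return light_strength * max(0.0, sum(a * b for a, b in zip(n, l)))

    print(diffuse_intensity((0, 0, 1), (0, 0, 1)))  # facing the light: 1.0
    print(diffuse_intensity((0, 0, 1), (1, 0, 0)))  # edge-on to the light: 0.0

A GPU performs calculations of this kind for millions of surface points per frame, which is why dedicated hardware matters.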

Most modern graphics cards include a microchip called the Graphics Processing Unit (GPU), which provides the algorithms and memory to process complex images. The GPU reduces the workload of the main CPU and provides faster processing. Different graphics cards have different capabilities in terms of processing power: they can render and refresh images 60 or more times per second, calculate shadows quickly, create image depth by rendering distant objects at lower resolution, modify surface textures fluidly and eliminate pixelation.

What Specifications to Consider

Processor clock speed

This impacts the rendering capability of the GPU. The clock speed itself is not the critical factor; rather, it is the per-clock performance of the graphics processor, indicated by the number of pixels it can process per clock cycle.

Memory size

This is the memory capacity used exclusively for graphics operations. Typical capacities range from 16-32MB through 64MB, 128MB and 256MB up to 512MB, 640MB and more. The more demanding your graphics applications are, the better you will be served by more memory on your graphics card.

Memory bandwidth

One thing that can slow down 3D graphics performance is the speed at which the computer delivers information to the graphics processor. A higher bandwidth means a faster data transfer, resulting in faster rendering speeds.
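As a rough illustration of how bandwidth figures come about, here's a small Python calculation; the clock, transfer and bus-width numbers are invented for the example, not any particular card's specification:

    # Bandwidth = memory clock * transfers per clock * bus width in bytes.
    memory_clock_hz = 500e6      # 500 MHz memory clock (illustrative)
    transfers_per_clock = 2      # DDR memory transfers data twice per clock
    bus_width_bits = 256         # width of the memory bus

    bytes_per_second = memory_clock_hz * transfers_per_clock * bus_width_bits / 8
    print(f"{bytes_per_second / 1e9:.1f} GB/s")   # 32.0 GB/s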

Shader model

DirectX Shader Models give developers control over the appearance of an image as it is rendered on screen, introducing visual effects like multi-layered shadows, reflection and fog.

Fill rate

This is the speed at which an image can be rendered, or "painted". The rate is specified in texels per second, the number of 3D pixels that can be painted per second (a texel is a pixel with depth). The fill rate combines the clock speed of the processor with the number of pixels it can process per clock cycle, and tells you how quickly an image can be fully rendered on screen.
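Here's that relationship as a quick Python calculation, with illustrative numbers rather than any specific card's:

    # Fill rate = core clock speed * pixels processed per clock cycle.
    core_clock_hz = 400e6        # 400 MHz core clock (illustrative)
    pixels_per_clock = 16        # pixels processed each cycle (illustrative)

    fill_rate = core_clock_hz * pixels_per_clock
    print(f"{fill_rate / 1e9:.1f} gigatexels per second")   # 6.4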

Vertices/triangles

Graphics chips don't work with curves; rather, they process flat surfaces. A curve is created from multiple flat planes arranged to look like a curve. 3D objects are built from multiple triangular surfaces, sometimes hundreds or even thousands, tessellated to represent the curves and angles of the real world, which is why 3D artists are concerned with the number of polygons required to form a shape. There are two different types of specification: vertices per second (the corners of the triangles) and triangles per second. To compare one measure with the other, you have to take into account the fact that adjacent triangles share vertices.
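A small Python sketch makes the vertex-sharing point concrete, using a triangle strip as the (assumed) arrangement:

    # In a triangle strip, each new triangle after the first reuses two
    # vertices from its neighbour, so n triangles need only n + 2 vertices,
    # while n independent triangles would need 3 * n.
    def strip_vertices(num_triangles):
        return num_triangles + 2

    def independent_vertices(num_triangles):
        return 3 * num_triangles

    for n in (1, 10, 1000):
        print(n, "triangles:", strip_vertices(n), "shared vs",
              independent_vertices(n), "unshared vertices")

So a vertices-per-second figure and a triangles-per-second figure can describe quite different workloads depending on how much sharing the geometry allows.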

Anti-aliasing

A technique used to smooth images by reducing the jagged stepping effect caused by diagonal lines and square pixels. Different levels of anti-aliasing have different effects on performance.
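For a feel of the performance cost, here's a tiny Python illustration based on supersampling, one common anti-aliasing approach (used here as an assumption; hardware employs several techniques):

    # Supersampling renders at a higher internal resolution and averages
    # down, so a 4x setting fills four times as many pixels per frame.
    width, height = 1024, 768
    for factor in (1, 2, 4):
        print(f"{factor}x supersampling fills {width * height * factor:,} pixels")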

RAMDAC

The Random Access Memory Digital-to-Analogue Converter takes the image data and converts it to a format that your screen can use. A faster RAMDAC means the graphics card can support higher output resolutions and refresh rates. Some cards have multiple RAMDACs, allowing the card to support multiple displays.
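As a rough rule of thumb (an approximation, not a vendor formula), the pixel clock a RAMDAC must sustain is about width times height times refresh rate, plus roughly 30% blanking overhead. In Python:

    # Approximate RAMDAC speed needed for a given resolution and refresh rate.
    # The 1.3 overhead factor for blanking intervals is an assumption.
    def required_ramdac_mhz(width, height, refresh_hz, overhead=1.3):
        return width * height * refresh_hz * overhead / 1e6

    print(f"{required_ramdac_mhz(1600, 1200, 85):.0f} MHz")   # roughly 212 MHz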

TV-out

Some graphics cards provide the option to connect a television, via either a composite (RCA) or S-Video connector. Common configurations include:

S-video Out
S-video In and S-video Out (VIVO)
YPbPr Connection for HDTV

DVI

Some graphics cards include a connector for DVI monitors, which is handy because many LCD screens support DVI. DVI offers better image quality than the standard VGA connector.

Dual-head

Dual-head is a term used when two monitors are used side by side, stretching your desktop across both.

SLI (Scalable Link Interface)

With SLI you can couple two graphics cards in your computer, enabling each card to take on half the rendering work and thereby, in the best case, nearly double performance.

When considering your graphics card, it pays to think about how much you need your computer to process your graphics output. A high-end graphics card with a high pixels-per-clock rating, large memory, a fast processor and other features means that you can run the latest games efficiently, or work in intensive graphics development.

Different Models

While there are many vendors of graphics cards, there are actually only two major manufacturers of chips for graphics cards. Nearly every graphics card on the market features a chip manufactured by either ATI or Nvidia. Cards using the same graphics chip will perform roughly the same as each other. However, even though they use the same chip, some feature slightly higher clock speeds, as well as manufacturer-guaranteed overclocking (an even higher clock speed than that specified). Other factors that will influence your decision include the amount of memory a card has (128MB, 256MB, 512MB) and its additional features, such as TV-out and dual-screen support.

Use the search facilities at Myshopping.com.au to compare the features, prices and vendors of graphics cards.

Andrew Gates is a writer for the comparison online shopping service http://www.myshopping.com.au . MyShopping.com.au helps you compare video cards - http://www.myshopping.com.au/PT--72_Graphics_Cards - and buy online from top-rated online stores. You can also read graphics card reviews and specifications - http://www.myshopping.com.au/PT--72_Graphics_Cards . 

A Journey From Video Game To Online Computer Game


Video games have been a significant force in society and one of the most popular leisure pursuits. In the days of the late '70s and early '80s they were a more or less solitary pursuit that relied on graphic improvements and better ways of shooting the enemy, on systems like the Atari, Intellivision, Colecovision, Sega and Nintendo. With the rise of the Internet and online games, however, a lot changed, including the ability to download games and play online, making gaming much more of a social activity, with many players and opponents playing each other from different corners of the world, bringing people from different parts of the world closer together as they play.

The earliest home video game system was Magnavox's "Odyssey", released in 1972, which included twelve simple games with graphic overlays. It was very simple and needed a lot of improvement. Seeing the opportunity, Nolan Bushnell, the founder of Atari, along with engineer Al Alcorn, created Pong, complete with built-in paddles and a speaker, and released it a year later; it proved a great success, and Atari soon came to dominate the market. Among the other video games famous worldwide were Pac-Man, the yellow blob that ate up dots and avoided squid-like ghosts, Space Invaders, Super Mario, Zelda, Metroid, and other classics. 

As time went on, the industry pushed for more efficient systems, and as a result the entire industry adopted the microprocessor, which let these systems produce groundbreaking and innovative graphical and auditory effects that had never been seen before. Millions of dollars were spent on video arcade machines and on home video game systems. Atari's VCS/2600 system still dominated the market through 1982, but the gaming market then crashed as public interest in video-game-specific consoles waned and sales dropped.

Video game history took a new turn with the end of Atari's reign and two innovations in 1984: a reduction in the cost of Dynamic RAM (DRAM) chips, which allowed more memory, and the production of higher-powered 8-bit processors, which lowered the prices of the previous chips. Sega and Nintendo of Japan entered the console market and would battle over the next five years for dominance. Today, in an advanced age of technology where the gaming market is saturated with hi-tech online computer games, the battle for dominance continues, this time between the PlayStation 2, the Xbox and the GameCube.
 
