Tuesday, June 17, 2008

Life Is Great With Modern Day Cheap Laptops

The modern day computer market is a completely different place than it was a few decades ago. Bulky traditional desktops find little place in the current market and have largely been replaced by smart, sleek, performance-oriented laptops.

A laptop has become an integral part of life for many of us, for one reason or another. While office executives use laptops to carry out their duties even on the move, housewives and students use them to search the Internet for valuable information for all sorts of purposes.

Let us consider some of the many benefits of a laptop over the traditional desktop computer. A laptop offers greater portability, flexibility and reliability, and takes up far less space. The emergence of new market players, intense competition, falling prices, higher levels of customer satisfaction and favourable market trends have all contributed to the growth of the laptop market in recent years.

Modern day cheap laptops handle demanding needs well, offering high-quality image resolution, wide viewing angles and good contrast ratios. Most of the latest laptops score heavily when it comes to satisfying the ever-growing needs of their users.

If you are planning to buy a new or second-hand laptop, you must consider some critical factors such as performance, price, substitute products, prevailing market trends, customer service and brand awareness. Demand the original bill and warranty documents from the vendor, and stay away from the grey market. There is no point in buying a cheap laptop only to regret it later. Do not be lured by the false promises and tall claims of profit-motivated vendors who leave no stone unturned to take advantage of unsuspecting customers.

Some of the leading names in the world of cheap laptops are Acer, HCL and Compaq. Advanced and stylish Acer laptops such as the Acer Aspire 5583WXMi, Acer TravelMate 4233WLMi, Acer TM 6292 and Acer Ferrari 1005WTMi offer complete value for your money. These elegant, power-packed machines will not let you down with sub-standard performance; they deliver exceptional quality on a consistent basis.

Thus, it can be easily concluded that present day cheap laptops are great assets for each one of us. The only thing to remember is that it's better to wait for the right purchase than to buy a low-quality laptop.


Source : http://www.articlealley.com/article_555590_10.html

How to Design Powerful Inventory Software?

I have been designing and developing inventory tracking software for the past 7 years, and from my experience, the most difficult part of inventory software is the month-end calculation. Any programmer can do that, but if the data grows by 4 GB per month, an improper design will kill the database in less than a year!

Let me share some ideas I use to counter such a problem:

Idea #1 - Design the system for batch processing.
You must admit you can't put all your eggs in one basket! While the market is screaming for real-time software, it is just not practical to invoke the business rules every time someone enters a transaction. Any inventory system designed this way will require 10 times the processing power, so unless it is a must-have feature, I do not recommend designing the system for real-time processing at all.

Idea #2 - Consolidate the processing into daily, weekly and monthly.
For a large inventory system you only need to keep the monthly summary long-term, but you must process and store weekly and even daily data temporarily. Each day's summary is consolidated into the weekly summary, and the daily data is then cleared for the next day's processing. Do the same with the weekly data once it has been consolidated into the monthly table. If you are using Microsoft SQL Server, take extra care when clearing the temporary data: there are faster ways than the SQL "delete from" command to speed up the process!
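As a rough sketch of this rollup idea (table and column names here are hypothetical, and SQLite stands in for the real database; on SQL Server, the faster alternative to DELETE hinted at above is TRUNCATE TABLE, which skips row-by-row logging):

```python
import sqlite3

# In-memory database standing in for the inventory store.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE daily_summary (day TEXT, item TEXT, qty INTEGER)")
cur.execute("CREATE TABLE weekly_summary (week TEXT, item TEXT, qty INTEGER)")

# A few days of movements for one item.
cur.executemany(
    "INSERT INTO daily_summary VALUES (?, ?, ?)",
    [("Mon", "widget", 10), ("Tue", "widget", 15), ("Wed", "widget", 5)],
)

# Consolidate the daily rows into one weekly row per item...
cur.execute(
    "INSERT INTO weekly_summary "
    "SELECT 'W25', item, SUM(qty) FROM daily_summary GROUP BY item"
)
# ...then clear the daily table for the next cycle.
# (On SQL Server: TRUNCATE TABLE daily_summary instead of DELETE.)
cur.execute("DELETE FROM daily_summary")

print(cur.execute("SELECT item, qty FROM weekly_summary").fetchall())
# → [('widget', 30)]
```

The same pattern repeats one level up: weekly rows roll into the monthly table, then the weekly table is cleared.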

Idea #3 - Report processing.
Do not store every report summary in a table; use tables only as a summary cache. Trust me on this: there will be many reports to come, and if a new summary table is created every time you need a report, the database will grow into a monster very fast! Instead, combine the daily, weekly and monthly data at query time, using SQL statements to join the different tables into a logical dataset, and report from there. Unless many people request the same report all the time, you might also consider creating a data warehouse for reporting purposes.
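One minimal way to realize this "tables as cache, reports as queries" idea is to combine the period summaries with UNION ALL at query time instead of materializing a table per report (again, table names are hypothetical and SQLite stands in for the production database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
for t in ("daily_summary", "weekly_summary", "monthly_summary"):
    cur.execute(f"CREATE TABLE {t} (item TEXT, qty INTEGER)")

# History sits in the monthly table; the current period sits in the
# weekly and daily tables that have not been rolled up yet.
cur.execute("INSERT INTO monthly_summary VALUES ('widget', 100)")
cur.execute("INSERT INTO weekly_summary VALUES ('widget', 30)")
cur.execute("INSERT INTO daily_summary VALUES ('widget', 7)")

# The report is a query over the combined logical dataset,
# not a new persisted summary table.
total = cur.execute(
    "SELECT item, SUM(qty) FROM ("
    "  SELECT * FROM daily_summary"
    "  UNION ALL SELECT * FROM weekly_summary"
    "  UNION ALL SELECT * FROM monthly_summary"
    ") GROUP BY item"
).fetchone()
print(total)  # → ('widget', 137)
```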

Idea #4 - Use multi-tier technology.
Multi-tier technology lets you separate the system's business rules from the processing, and may even let you move some of the processing, such as report generation, from the database server to an application server. This balances the processing load across machines and provides a better user experience.

There are still many things to consider, but these 4 major ideas outline the critical requirements of heavy-duty inventory software. Just keep in mind that the design must always align with your client's business environment: the volume of data and the nature of the business will decide how the system should handle the processing. So keep studying until you understand your client's requirements...


Source : http://www.articlealley.com/article_555674_10.html

What Makes VS2008 500% Faster in Coding?

Did you know you can program 5X faster than before? Visual Studio 2008 includes tools that many developers have ignored since Visual Studio 2005, but I must stress that these tools have been improved dramatically to better support database programming, and I would not be surprised if they let you write database applications several times faster than a conventional programmer.

OK, you may have heard of something called RAD a long time ago, but I don't think many of you took it seriously. I have used RAD since Borland Delphi, and I am impressed that VS2008 takes it even further! If you have no idea what RAD is, read on; I guarantee this will open your eyes to a new programming era.

So what is RAD? RAD stands for Rapid Application Development: a set of tools and code generators that map the database layer into logical objects without writing a single line of code! You can even override events to create your own insert, update and delete business rules.

The dataset designer was introduced back in VS2003, but at that time the technology was still not mature. Since VS2005, the dataset designer has evolved into a powerful object-mapping tool. You can define all business rules in the dataset designer and logically separate the interfaces from the data. There are still some limitations: in theory the dataset should be portable to ASP.NET, and if that can be achieved, imagine how easy it would be to port existing Windows-based clients to Web-based clients!

VS2008 introduces a new ADO.NET component called the TableAdapterManager. This powerful component allows developers to perform hierarchical master-detail updates in a few clicks! This was not possible a few years back without a lot of coding.

The framework does not stop there: VS2008 extends n-tier support further by separating the data access object from the dataset designer. Now the client can easily share the data access interface with the server, and developers no longer need to separate the interfaces manually!

The enhancements continue: many exciting features are planned for VS2008, some currently available as betas, such as Silverlight and the Microsoft Sync Framework for ADO.NET, and many more...


Source : http://www.articlealley.com/article_555749_10.html

What is WDM (Wavelength Division Multiplexing) for Fiber Optic Communication? Fiber Optic Tutorial

What is WDM?

WDM is the abbreviation for Wavelength Division Multiplexing. It splits the light in an optical fiber into a number of discrete wavelengths (colors). Each wavelength is an independent channel running at a data rate of 2.5Gbit/s, 10Gbit/s, 40Gbit/s or even 100Gbit/s (still under development). So if the light in the fiber is split into 16 wavelengths (colors or channels), and each wavelength runs at a 40Gbit/s data rate, we get a total rate of 40Gbit/s x 16 = 640Gbit/s. This is especially valuable in long haul and ultra long haul fiber optic communication links.

In addition, systems carrying 64 and more channels (wavelengths) per fiber are already available on the market, which means we can run a 2,560Gbit/s data rate on a single fiber. How about 48 fibers in a single fiber optic cable? That gives us an amazing 2,560Gbit/s x 48 = 122,880Gbit/s link. Of course, links with this kind of speed and fiber count are usually deployed only for Internet backbones.
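The capacity arithmetic above is just channels x per-channel rate x fiber count; a quick sketch (the function name is merely illustrative):

```python
# Aggregate WDM link capacity: channels x per-channel rate x fiber count.
def wdm_capacity_gbps(channels, rate_gbps, fibers=1):
    return channels * rate_gbps * fibers

print(wdm_capacity_gbps(16, 40))      # → 640    (Gbit/s on one fiber)
print(wdm_capacity_gbps(64, 40))      # → 2560   (Gbit/s on one fiber)
print(wdm_capacity_gbps(64, 40, 48))  # → 122880 (Gbit/s for a 48-fiber cable)
```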

From the examples above, you can see the striking power of WDM: it dramatically increases the capacity of a fiber optic link while minimizing equipment and fiber optic cable costs.

What is DWDM?

DWDM stands for Dense Wavelength Division Multiplexing. Here "dense" means the wavelength channels are very narrow and close to each other. For 100 GHz dense WDM, the interval between adjacent channels is only 100 GHz (or 0.8nm). For example, adjacent channels could be 1530.33nm, 1531.12nm and 1531.90nm.
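The 100 GHz-to-0.8nm relation follows from the standard conversion between frequency spacing and wavelength spacing near a center wavelength, Δλ ≈ λ² · Δf / c. A quick check in Python (the function name is just for illustration):

```python
# Convert a channel frequency spacing (Hz) to a wavelength spacing (nm)
# near a center wavelength, using delta_lambda ≈ lambda^2 * delta_f / c.
C = 299_792_458  # speed of light in vacuum, m/s

def spacing_nm(center_nm, delta_f_hz):
    center_m = center_nm * 1e-9
    return center_m ** 2 * delta_f_hz / C * 1e9  # convert back to nm

print(round(spacing_nm(1550, 100e9), 2))  # → 0.8 (nm per 100 GHz near 1550nm)
```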

DWDM is widely used in the 1550nm band so as to leverage the capabilities of EDFAs (Erbium Doped Fiber Amplifiers). EDFAs are commonly used for the 1525nm ~ 1565nm (C band) and 1570nm ~ 1610nm (L band) ranges.

Why is DWDM so important?

The exploitation of DWDM has fueled an explosion in transmission capacity. The amount of information that can be sent over the fiber cables that span the world has increased so much that there is now a glut of available capacity.

In practice, more can be wrung out of DWDM systems by extending the upper or lower bounds of the available transmission window or by spacing the wavelengths more closely, typically at 50 GHz or even 25 GHz. In doing this, suppliers can double or triple the number of channels. Each optical channel can currently be used routinely for transmission of light pulses at 10Gbit/s, or at even higher data rates with 100 GHz spacing. With the help of WDM, a pair of fibers can provide a data capacity of several hundred gigabits per second.

WDM technology does not require any upgrade or replacement of the fiber infrastructure that has already been put in the ground. Hence, links can be upgraded from one capacity level to the next simply by reconfiguring or upgrading terminal equipment and repeaters.

WDM technologies provide the raw transmission capacity. This has to be structured in some way so that it can carry useful traffic and be routed where it needs to go. This is where the next layer of network protocols comes into play: SDH and SONET (they are equivalent; SONET is used in the United States while SDH is used in the rest of the world). We will touch on SDH and SONET in other tutorials.


Source : http://www.articlealley.com/article_555851_10.html

Some Basic Concepts of Fiber Optic Loss Testing

When testing the loss of a fiber optic link, some basic principles must be kept in mind at all times.

1. The testing wavelength should always be the same as the working wavelength. Because optical fiber loss varies with light wavelength, you will get an incorrect result if your measuring wavelength differs from the actual working wavelength. For example, if a system is designed for 1550nm but you test it with a 1310nm light source and power meter, the result will not be correct.
2. The testing light source should be the same type as the intended working lightwave equipment's source. If the system is designed for an LED source, you should test it with an LED source. If the system is designed for multimode laser light, you should use a multimode laser light source for testing. The same applies to single mode laser light sources.

Fiber optic equipment used in loss testing

In a basic loss testing setup, four types of test equipment are needed. They are the light source, the power meter, the reference patch cables and the adapter (mating sleeve).

Here are some considerations when choosing your equipment.

The light source should have the same wavelength as the operating equipment, the proper mode (multimode or single mode, the same as the operating equipment), the proper type (LED or laser, the same as the operating equipment) and the proper connector.

The power meter should have the same wavelength as the light source and the proper connector, and it should be calibrated.

The reference patch cables should be high quality with known loss, have proper connectors and be the same fiber type as the plant being tested.

The adapter (mating sleeve) should have high quality ceramic sleeves and be the proper type (FC, SC, LC, etc.).

Understanding dB (decibel) in fiber optic loss testing

As in any power measurement, fiber optic light power can be expressed in milliwatts (mW), but a more convenient unit is the dB (decibel).

The decibel (dB) is most often used in electronics testing. It is the ratio between two power levels: one is the input and the other is the output. The ratio is calculated on a logarithmic scale, as explained below.

For power measurement, dB is defined as: dB = 10 x log(output power/input power)

So, for example, if after a fiber link the output light power level is 50% of the input, the loss of the link is 10 x log(0.5) = -3 dB.

Since dB is actually a ratio, it has no absolute units. So from the measurement above, we have no idea of the actual power; it may be 0.1 mW or 1 mW.

That is why we have another unit, dBm. It is the ratio of the measured power to a 1mW reference power. It is defined as: dBm = 10 x log(measured power/1mW)

So, for example, 0.1mW of light power expressed in dBm is 10 x log(0.1mW/1mW) = -10 dBm.

From the above we know that dBm is an absolute unit: we know exactly how many mW it represents.

For fiber optic loss testing, the decibel is the most often used unit since it is much easier to work with. Why? Because dB values can simply be added or subtracted. For example, a fiber link may have three sections with losses of 0.5dB, 5dB and 0.5dB; the total loss is then simply 0.5dB + 5dB + 0.5dB = 6 dB. Try converting these to actual milliwatt ratios and you will see that the result agrees!
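The dB and dBm formulas above, and the claim that section losses add, can be verified with a few lines of Python (function names are just for illustration):

```python
import math

def db(output_power, input_power):
    """Power ratio in decibels: dB = 10 * log10(out/in)."""
    return 10 * math.log10(output_power / input_power)

def dbm(power_mw):
    """Absolute power relative to a 1 mW reference."""
    return 10 * math.log10(power_mw / 1.0)

print(round(db(0.5, 1.0), 1))  # → -3.0  (half the power is lost)
print(round(dbm(0.1), 1))      # → -10.0 (0.1 mW expressed in dBm)

# Section losses in dB add; in linear terms the power ratios multiply.
sections_db = [0.5, 5.0, 0.5]
linear = math.prod(10 ** (-d / 10) for d in sections_db)
print(round(-10 * math.log10(linear), 1))  # → 6.0 (dB total loss)
```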


Source : http://www.articlealley.com/article_555859_10.html

How Computer Enclosures Can Help Improve Production

We have noticed that computer enclosures are being used more and more by large multinational businesses in the drive for improved production and reduced waste costs. Computer enclosures are manufactured from steel and epoxy powder coated to protect them in shop floor environments. This means that standard computers and printers can be used in areas previously thought unsuitable, especially where water and industrial dust are present.

Major manufacturers such as Ford Motors, SCA, Pilkington Glass and the like are implementing data capture solutions on the shop floor. This means that at any one time, management can see how productive a certain employee is and how much waste they create per shift (Big Brother watching us…), thereby improving the company's productivity and profit. Software such as ERP, CMS and SAP systems lead the field, giving management total control of the business.

The benefit of using a standard computer in a computer enclosure over an industrial PC is this: if an industrial PC has a technical problem, the user has to wait a number of days before an engineer can service the unit, and engineers normally charge a high hourly rate. With a standard PC in a computer enclosure, if there is a technical issue the user can quickly replace the computer with a unit from an office, minimising production downtime at limited cost.

A cost comparison was carried out against industrial PCs, and the standard computer plus enclosure proved the most cost-effective way to collect shop floor data, resulting in a saving of around 75% compared to an industrial PC.


Source : http://www.articlealley.com/article_555891_10.html