
Tuesday, November 28, 2006

Interview Tips

1. If you do not recognize a term, ask for a definition. You may know the methodology or term but have used a different name for it.

2. Always keep in mind that the employer wants to know what you are going to do for them; with that in mind, always stay positive.

3. Don't feel as though you need to answer the question literally.

4. Be brief in your response, yet be prepared to expand should you be asked to do so, or should you judge it a good opening for a more complete response.

5. Be positive in your responses, even when the question is negative.



To help you get started, here are 10 probable questions that you will be asked during the interview process, along with some suggestions on how to answer each.


1. Why did you send me your resume?

Suggestions: Emphasize what you can do for them versus what you want from them. Shape your response along the lines of, "I sent my resume with the belief that my skills of A, B and C, along with my experience of X, Y and Z, would prove a valuable resource to your company."

Example: "I sent you my resume, highlighting my technical talents in IT along with my managerial experience, confident that you would see me as a valuable resource. Allow me to be more specific as I learn more about your immediate needs."


2. Tell me about yourself...

Suggestion: It is tempting, with such a question, to tell your life story. Resist this temptation. Respond with a quick overview highlighting your skills and accomplishments, and flavor your answer with a comment about your style and personality.

Example: "My career is a balance between the technical and managerial. Early in my career, I focused on providing technical solutions to business problems, and, more recently, on providing solid leadership in the re-engineering of the IT department. Beyond my many skills and experience, I believe my success is due largely to my ability to quickly engage others, motivate team members and clearly communicate."


3. How can you help us?

Suggestion: If there are identifiable corporate needs, then address them; if not, be careful not to go out on a limb claiming you can do things that may not be of interest. Rather, turn the question and response from what you CAN do to what you HAVE done.

Example: "Not knowing the particulars of your current situation, allow me to share with you some recent successes which are representative of my talents. (give an example and quantify) With that said, may I ask how that begins to address your needs?"


4. What are your strengths?

Suggestion: Be selective of your many strengths and choose three or four that directly relate to the position or identified corporate needs. Sometimes a mere listing of skills has high impact and is most effective. Yet, be prepared to expand and give examples if prompted.

Example: "Three strengths or skills quickly come to mind: namely, 1, 2 and 3. (pause) I mention these particular skills or strengths because they have served me well throughout my career and, I believe, would prove valuable to your company as you address issues of A, B and C."


5. What are your weaknesses?

Suggestion: Minimize your response. Even though the interviewer might ask for "weaknesses," give just one at a time. Make it realistic, yet don't give ammunition to fire back at you. Mention some "weakness" that in a different context might be considered a strength.

Example: "I'm tenacious and I hate to give up, yet I do realize the value of time and the importance of not taking decisions to halt an effort as a personal failure."


6. What types of problems do you like to deal with?

Suggestion: Not all problems are negative. Sometimes a problem involves fixing something that is broken; it can also involve capitalizing on an opportunity. Reflect for a moment on the position and why it exists, then reference responsibilities and activities in which you thrive and which you believe would contribute to the company's productivity, growth and profit.

Example: "I enjoy a mix of problems, both the quick daily problems that I can address based on my knowledge and experience, as well as larger problems that involve input from many sources, careful analysis and strategic thinking. My preference, too, is toward quantitative issues that can be resolved in a matter of weeks."


7. How do you motivate others?

Suggestion: Think for a moment about human behavior and your philosophy of managing; then comment on your style of encouraging and supporting others toward completing a common goal. Be prepared with concrete examples.

Example: "Ultimately, a person can only motivate himself/herself. The role of the manager is to understand what motivates individuals and provide the support for that person to be successful. It takes patience in listening and understanding others, an expressed confidence in the person and a reinforcement of that person's progress toward a goal."


8. What would your boss say about you?

Suggestion: Although we have all been criticized from time to time, be positive and highlight some of your qualities and strengths. If your relationship with your boss is strained or negative, generalize the response to bosses you have had over the years and the compliments that they have paid you. Once again, choose comments that reinforce your candidacy for the position for which you are interviewing.

Example: "I've had the good fortune of having bosses who are very supportive of my work. They have commonly complimented me on my analytical and problem-solving skills. In addition, they recognize my leadership abilities and willingness to push myself and others toward concrete results."


9. What do you hope to be doing in three years?

Suggestion: Indicate that you would be first looking to do an outstanding job in the position under discussion. As to the future, show your ambition, yet be realistic and avoid mentioning positions by title. Be careful not to set yourself up as a competitor with the interviewer.


Example: "First, let me say that I would certainly look forward to being part of your team and am confident that I could do an outstanding job. As to the future, I think my success would bring other responsibilities. So, regarding your question, I would see myself playing an increasingly important role in the company's leadership and growth."


10. Why should I consider hiring you over other candidates?

Suggestion: Emphasize that, while not knowing the other candidates, you are confident that you have the skills, experience and demonstrated accomplishments that have prepared you for such a role and that assure your future success. This question provides a perfect opportunity to close by asking for feedback and the interviewer's full support in moving forward.

Example: "I have been in your position of making hiring decisions, and I have always asked the question, 'Who is skilled, motivated and most likely to fit with the team?' From our discussion, I hope I have clearly conveyed my skills and experience, and my strong desire to be part of your team. Bottom line, I am confident I can get the job done for you in a timely and profitable manner."

What is Software Development Life Cycle (SDLC)?

The various activities that are undertaken when developing software are commonly modelled as a software development lifecycle. The software development lifecycle begins with the identification of a requirement for
software and ends with the formal verification of the developed software against that requirement.

The software development lifecycle does not exist by itself; it is in fact part of an overall product lifecycle. Within the product lifecycle, software will undergo maintenance to correct errors and to comply with changes to requirements. The simplest overall form is where the product is just software, but it can become much more complicated with multiple software developments, each forming part of an overall system to comprise a product.

There are a number of different models for software development lifecycles. Some of the more commonly used models are:

-> Waterfall Lifecycle Model
-> Modified Waterfall Lifecycle Model
-> V Lifecycle Model
-> Progressive Development Lifecycle Model
-> Iterative / Incremental Lifecycle Model
-> Spiral Lifecycle Model
-> Prototyping Model
-> RAD Lifecycle Model

Levels in CMM

Level 1
Characterized by chaos, periodic panics, and heroic efforts required by individuals to successfully complete projects. Few if any processes in place; successes may not be repeatable.

Level 2
Software project tracking, requirements management, realistic planning, and configuration management processes are in place; successful practices can be repeated.

Level 3
Standard software development and maintenance processes are integrated throughout an organization; a Software Engineering Process Group is in place to oversee software processes, and training programs are used to ensure understanding and compliance.

Level 4
Metrics are used to track productivity, processes, and products. Project performance is predictable, and quality is consistently high.

Level 5
The focus is on continuous process improvement. The impact of new processes and technologies can be predicted and effectively implemented when required.

Difference between Quality Assurance (QA) and Quality Control (QC)

What is Software Quality?
Though there are a number of definitions propounded by the gurus in the field - each having its own adherents, two definitions that are widely accepted and that are complementary to each other are:
- Conformance to explicitly stated functional and performance requirements, explicitly documented development standards, and implicit characteristics that are expected of all professionally developed software.
- The degree to which a system, component, or process meets specified requirements and customer or user needs or expectations.

Quality assurance and quality control both contribute in delivering a high quality software product though the way they go about it is different. This can be illustrated by looking at the definitions of the two.

What is Software Quality Assurance?
Software QA involves the entire software development PROCESS - monitoring and improving the process, making sure that any agreed-upon standards and procedures are followed, and ensuring that problems are found and dealt with. It is oriented to 'prevention'.

This is a 'staff' function, responsible for establishing standards and procedures to prevent defects and breakdowns in the SDLC. The focus of QA is prevention, processes, and continuous improvement of those processes.

What is Software Quality Control?
This is a department function which compares the standards to the product and takes action when non-conformance is detected, for example through testing. This involves operation of a system or application under controlled conditions and evaluating the results (e.g., 'if the user is in interface A of the application while using hardware B, and does C, then D should happen'). The controlled conditions should include both normal and abnormal conditions. Testing should intentionally attempt to make things go wrong, to determine whether things happen when they shouldn't or don't happen when they should. It is oriented to 'detection'.

Relationship between QC and QA
An application that totally meets its requirements can be said to exhibit quality. Quality is not based on a subjective assessment but rather on a clearly demonstrable, and measurable, basis.

- Quality Control is a process directed at validating that a specific deliverable meets standards, is error free, and is the best deliverable that can be produced. It is a responsibility internal to the team.
- QA, on the other hand, is a review with a goal of improving the process as well as the deliverable. QA is often an external process. QA is an effective approach to producing a high quality product.

One aspect is the process of objectively reviewing project deliverables and the processes that produce them (including testing), to identify defects, and then making recommendations for improvement based on the reviews. The end result is the assurance that the system and application is of high quality, and that the process is working. The achievement of quality goals is well within reach when organizational strategies are used in the testing process. From the client's perspective, an application's quality is high if it meets their expectations.

What is SEI? CMM? CMMI? ISO? IEEE? ANSI?

SEI = 'Software Engineering Institute' at Carnegie Mellon University; initiated by the U.S. Defense Department to help improve software development processes.

CMM = 'Capability Maturity Model', now called the CMMI ('Capability Maturity Model Integration'), developed by the SEI. It's a model of 5 levels of process 'maturity' that determine effectiveness in delivering quality software. It is geared to large organizations such as large U.S. Defense Department contractors. However, many of the QA processes involved are appropriate to any organization, and if reasonably applied can be helpful. Organizations can receive CMMI ratings by undergoing assessments by qualified auditors.

ISO = 'International Organisation for Standardization' - The ISO 9001:2000 standard (which replaces the previous standard of 1994) concerns quality systems that are assessed by outside auditors, and it applies to many kinds of production and manufacturing organizations, not just software. It covers documentation, design, development, production, testing, installation, servicing, and other processes.
The full set of standards consists of:
(a)Q9001-2000 - Quality Management Systems: Requirements;
(b)Q9000-2000 - Quality Management Systems: Fundamentals and Vocabulary;
(c)Q9004-2000 - Quality Management Systems: Guidelines for Performance Improvements.

To be ISO 9001 certified, a third-party auditor assesses an organization, and certification is typically good for about 3 years, after which a complete reassessment is required. Note that ISO certification does not necessarily indicate quality products - it indicates only that documented processes are followed. Also see http://www.iso.ch/ for the latest information.

IEEE = 'Institute of Electrical and Electronics Engineers' - among other things, creates standards such as the 'IEEE Standard for Software Test Documentation' (IEEE/ANSI Standard 829), the 'IEEE Standard for Software Unit Testing' (IEEE/ANSI Standard 1008), the 'IEEE Standard for Software Quality Assurance Plans' (IEEE/ANSI Standard 730), and others.

ANSI = 'American National Standards Institute', the primary industrial standards body in the U.S.; publishes some software-related standards in conjunction with the IEEE and ASQ (American Society for Quality). Other software development/IT management process assessment methods besides CMMI and ISO 9000 include SPICE, Trillium, TickIT, Bootstrap, ITIL, MOF, and CobiT.

What is Six Sigma?

In statistics, "sigma" refers to the standard deviation, a measure of variation. The Six Sigma methodology focuses on reducing variation as a way to improve a process. In particular, it uses sigma to measure how well a process produces defect-free results. A defect is anything that causes customer dissatisfaction, such as a product that doesn't meet the customer's specifications, poor service, or a price tag that's too high.

The term Six Sigma defines an optimum measurement of quality. A company operating at Six Sigma quality produces 3.4 defects per million opportunities; that is, the work is 99.999966% defect-free. Today, most organizations operate between 3 and 4 sigma, that is, between roughly 6,000 and 66,000 defects per million opportunities. Even the best companies in the world operate at only about the 99% (3.8 sigma) defect-free level.
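The defect rates quoted above follow from the normal distribution. As an illustrative sketch (not part of the original post), the conversion from a sigma level to defects per million opportunities (DPMO) can be computed with the standard normal CDF and the conventional 1.5-sigma long-term shift:

```python
import math

def dpmo(sigma_level: float, shift: float = 1.5) -> float:
    """Defects per million opportunities at a given sigma level,
    using the conventional 1.5-sigma long-term shift."""
    # Standard normal CDF computed via the error function
    phi = 0.5 * (1.0 + math.erf((sigma_level - shift) / math.sqrt(2.0)))
    return (1.0 - phi) * 1_000_000

print(round(dpmo(6.0), 1))  # 3.4  (Six Sigma quality)
print(round(dpmo(4.0)))     # 6210
print(round(dpmo(3.0)))     # 66807
```

At 3 and 4 sigma this yields roughly 66,800 and 6,200 defects per million, in line with the range quoted above.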

What is Software Testing Life Cycle?

Usually, testing is considered part of the System Development Life Cycle, but it can also be treated in its own right as the Software Testing Life Cycle (or Test Development Life Cycle).
Software Testing Life Cycle consists of the following phases:

1. Planning
2. Analysis
3. Design
4. Execution
5. Cycles
6. Final Testing and Implementation
7. Post Implementation

1. Test Planning (Product Definition Phase):
The test planning phase mainly involves preparation of a test plan. A test plan is a high-level planning document derived from the project plan (if one exists) and details the future course of testing. Sometimes a quality assurance plan, which is broader in scope than a test plan, is also prepared.

Contents of a Test Plan are as follows:

* Scope of testing
* Entry Criteria (when testing will begin)
* Exit Criteria (when testing will stop)
* Testing Strategies (Black Box, White Box, etc.)
* Testing Levels (Integration testing, Regression testing, etc.)
* Limitations (if any)
* Planned Reviews and Code Walkthroughs
* Testing Techniques (Boundary Value Analysis, Equivalence Partitioning, etc.)
* Testing Tools and Databases (automated testing tools, performance testing tools)
* Reporting (how bugs will be reported)
* Milestones
* Resources and Training
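As an illustration of two of the testing techniques named in the plan, Equivalence Partitioning and Boundary Value Analysis, here is a minimal sketch; the age-validation rule is a made-up example, not something from the post:

```python
def is_valid_age(age: int) -> bool:
    """Hypothetical rule under test: accept ages 18 through 65 inclusive."""
    return 18 <= age <= 65

# Equivalence Partitioning: one representative value per partition
# (too young / valid / too old)
partitions = {10: False, 40: True, 70: False}

# Boundary Value Analysis: values at and just around each boundary
boundaries = {17: False, 18: True, 19: True, 64: True, 65: True, 66: False}

for value, expected in {**partitions, **boundaries}.items():
    assert is_valid_age(value) == expected
print("all cases pass")
```

The point of the two techniques is to cut an infinite input space down to a handful of representative and boundary cases.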


The contents of an SQA Plan, which is broader than a test plan, are as follows. The IEEE standard for SQA Plan preparation contains the following outline:

* Purpose
* Reference Documents
* Management
* Documentation
* Standards, Practices and Conventions
* Reviews and Audits
* Software Configuration Management
* Problem Reporting and Corrective Action (Software Metrics to be used can be identified at this stage)
* Tools, Techniques and Methodologies
* Code Control
* Media Control
* Supplier Control
* Records Collection, Maintenance and Retention

2. Test Analysis (Documentation Phase)

The Analysis phase is really an extension of the planning phase. Whereas the planning phase deals with high-level plans, the Analysis phase is where detailed plans are documented: actual test cases and scripts are planned and written down here.

This phase can be further broken down into the following steps:

* Review Inputs: The requirement specification document, feature specification document and other project planning documents are taken as inputs, and the test plan is decomposed into lower-level test cases.
* Formats: Generally at this phase a functional validation matrix based on business requirements is created, and the test case format is finalized. Software metrics are also designed at this stage. Using software such as Microsoft Project, the testing timeline and milestones are created.
* Test Cases: Based on the functional validation matrix and other input documents, test cases are written, and features are mapped to test cases.
* Plan Automation: While creating test cases, those that should be automated are identified. Ideally, the test cases relevant to Regression Testing are the ones chosen for automation. Areas for performance, load and stress testing are also identified.
* Plan Regression and Correction Verification Testing: The testing cycles, i.e. the number of times testing will be repeated to verify that bug fixes have not introduced new errors, are planned.


3. Test Design (Architecture Document and Review Phase):
One has to realize that the testing life cycle runs parallel to the software development life cycle. So by the time one reaches this phase, the development team will have created some code, at least a prototype, or at minimum a design document.

Hence, in the Test Design (Architecture Document) phase, all the plans, test cases, etc. from the Analysis phase are revised and finalized. In other words, looking at the work product or design, the test cases, test cycles and other plans are finalized, and newer test cases are added. Risk assessment criteria are developed, and the writing of automated testing scripts begins. Finally, the testing reports (especially unit testing reports) are finalized. Quality checkpoints, if any, are included in the test cases based on the SQA Plan.


4. Test Execution (Unit / Functional Testing Phase):
By this time, the development team would have completed creation of the work products. Of course, the work products would still contain bugs. So, in the execution phase, developers carry out unit testing, with testers' help if required. Testers execute the test plans, automated testing scripts are completed, and stress and performance testing is executed. White box testing, code reviews, etc. are conducted. As and when bugs are found, they are reported.


5. Test Cycle (Re-Testing Phase):
By this time, at minimum one test cycle (one round of test execution) would have been completed and bugs would have been reported. Once the development team fixes the bugs, a second round of testing begins. This could be mere correction verification testing, that is, checking only the part of the code that has been corrected. It could also be Regression Testing, where the entire work product is tested to verify that the correction has not affected other parts of the code.

Hence this process of :
Testing --> Bug reporting --> Bug fixing (and enhancements) --> Retesting
is carried out as planned. Here is where automation tests are extremely useful to repeat the same test cases again and again.
During this phase - review of test cases and test plan could also be carried out.

6. Final Testing and Implementation (Code Freeze Phase):
When the exit criteria are achieved or the planned test cycles are completed, final testing is done. Ideally, this is System or Integration testing. Any remaining Stress and Performance testing is also carried out. Input for process improvement, in the form of software metrics, is given. Test reports are prepared. If required, a test release note, releasing the product for roll-out, is prepared. Other remaining documentation is completed.


7. Post Implementation (Process Improvement Phase):
This phase, though it looks good on paper, is seldom carried out. In this phase, the testing effort is evaluated and lessons learnt are documented. Software Metrics (Bug Analysis Metrics) are analyzed statistically and conclusions are drawn. Strategies to prevent similar problems in future projects are identified. Process improvement suggestions are implemented. Cleaning up of the testing environment and archival of test cases, records and reports are done.

What is verification & validation?

Q1 What is verification?

Verification ensures the product is designed to deliver all functionality to the customer; it typically involves reviews and meetings to evaluate documents, plans, code, requirements and specifications. This can be done with checklists, issues lists, walkthroughs and inspection meetings. You CAN learn to do verification with little or no outside help.


Q2 What is validation?

Validation ensures that functionality, as defined in requirements, is the intended behavior of the product; validation typically involves actual testing and takes place after verifications are completed.


Q3 What is a walkthrough?

A walkthrough is an informal meeting for evaluation or informational purposes. A walkthrough is also a process at an abstract level. It's the process of inspecting software code by following paths through the code (as determined by input conditions and choices made along the way). The purpose of code walkthroughs is to ensure the code fits the purpose. Walkthroughs also offer opportunities to assess an individual's or team's competency.


Q4 What is an inspection?

An inspection is a formal meeting, more formalized than a walkthrough and typically consists of 3-10 people including a moderator, reader (the author of whatever is being reviewed) and a recorder (to make notes in the document). The subject of the inspection is typically a document, such as a requirements document or a test plan. The purpose of an inspection is to find problems and see what is missing, not to fix anything. The result of the meeting should be documented in a written report. Attendees should prepare for this type of meeting by reading through the document, before the meeting starts; most problems are found during this preparation. Preparation for inspections is difficult, but is one of the most cost-effective methods of ensuring quality, since bug prevention is more cost effective than bug detection.


Q5 What is quality?

Quality software is software that is reasonably bug-free, delivered on time and within budget, meets requirements and expectations and is maintainable. However, quality is a subjective term. Quality depends on who the customer is and their overall influence in the scheme of things. Customers of a software development project include end-users, customer acceptance test engineers, customer contract officers, customer management, the development organization's management, test engineers, testers, salespeople, software engineers, stockholders and accountants. Each type of customer will have his or her own slant on quality. The accounting department might define quality in terms of profits, while an end-user might define quality as user-friendly and bug-free.


Q6 What is a Test Case?

A test case is usually a single step, and its expected result, along with various additional pieces of information. It can occasionally be a series of steps but with one expected result or expected outcome. Optional fields include a test case ID, test step or order-of-execution number, related requirement(s), depth, test category, author, and check boxes for whether the test is automatable and has been automated. Larger test cases may also contain prerequisite states or steps, and descriptions. A test case should also contain a place for the actual result. Test cases can be stored in a word processor document, spreadsheet, database or other common repository. In a database system, you may also be able to see past test results, who generated the results, and the system configuration used to generate them. These past results would usually be stored in a separate table.
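The fields described above could be sketched as a simple record type. The field names below are illustrative, not a standard:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """One test step with its expected result, plus optional metadata."""
    test_case_id: str
    step: str
    expected_result: str
    related_requirements: list[str] = field(default_factory=list)
    author: str = ""
    automatable: bool = False
    automated: bool = False
    actual_result: str = ""   # filled in when the test is executed

# A hypothetical test case built from this record type
tc = TestCase(
    test_case_id="TC-001",
    step="Log in with a valid user name and password",
    expected_result="User lands on the dashboard page",
    related_requirements=["REQ-12"],
)
print(tc.test_case_id, tc.automated)
```

Whether such records live in a spreadsheet, a database table or code, the essential shape - step, expected result, actual result, metadata - stays the same.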



Q7 What is a Test Suite and a Test Script?

The most common term for a collection of test cases is a test suite. The test suite often also contains more detailed instructions or goals for each collection of test cases. It contains a section where the tester identifies the system configuration used during testing. A group of test cases may also contain prerequisite states or steps, and descriptions of the following tests.
Collections of test cases are sometimes incorrectly termed a test plan. They may also be called a test script, or even a test scenario.
A test plan is the approach that will be used to test the system, not the individual tests.
Most companies that use automated testing will call the code that is used their test scripts.
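As a sketch of what such an automated "test script" for a small suite might look like, here is a minimal example using Python's standard unittest module; the add function is invented for illustration:

```python
import unittest

def add(a, b):
    # Hypothetical function under test, invented for this example
    return a + b

class AddTestSuite(unittest.TestCase):
    """A collection of related test cases, informally a 'test suite'."""

    def test_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative_numbers(self):
        self.assertEqual(add(-2, -3), -5)

    def test_zero(self):
        self.assertEqual(add(0, 7), 7)

# Collect the test cases into a suite and run them
suite = unittest.defaultTestLoader.loadTestsFromTestCase(AddTestSuite)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("suite passed:", result.wasSuccessful())
```

Each test method is one test case; the class groups them into a suite that can be rerun on every build.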


Q8 What is a Test Scenario?

A scenario test is a test based on a hypothetical story used to help a person think through a complex problem or system. It can be as simple as a diagram of a testing environment or a description written in prose. The ideal scenario test has five key characteristics. It is

(a) a story that is
(b) motivating,
(c) credible,
(d) complex, and
(e) easy to evaluate.

They usually differ from test cases in that test cases are single steps while scenarios cover a number of steps. Test suites and scenarios can be used in concert for complete system tests.

Types of Testing

Acceptance Testing
Testing the system with the intent of confirming readiness of the product and customer acceptance. Acceptance testing, which is black box testing, will give the client the opportunity to verify the system functionality and usability prior to the system being moved to production. The acceptance test will be the responsibility of the client; however, it will be conducted with full support from the project team. The Test Team will work with the client to develop the acceptance criteria.


Ad Hoc Testing
Testing without a formal test plan or outside of a test plan. With some projects this type of testing is carried out as an adjunct to formal testing. If carried out by a skilled tester, it can often find problems that are not caught in regular testing. Sometimes, if testing occurs very late in the development cycle, this will be the only kind of testing that can be performed. Sometimes ad hoc testing is referred to as exploratory testing.


Alpha Testing
Testing after code is mostly complete or contains most of the functionality and prior to users being involved. Sometimes a select group of users are involved. More often this testing will be performed in-house or by an outside testing firm in close cooperation with the software engineering department.


Automated Testing
Software testing that utilizes a variety of tools to automate the testing process, reducing the need for a person to test manually. Automated testing still requires a skilled quality assurance professional with knowledge of the automation tool and the software being tested to set up the tests.


Beta Testing
Testing after the product is code complete. Betas are often widely distributed or even distributed to the public at large in hopes that they will buy the final product when it is released.


Black Box Testing
Testing software without any knowledge of the inner workings, structure or language of the module being tested. Black box tests, like most other kinds of tests, must be written from a definitive source document, such as a specification or requirements document.


Compatibility Testing
Testing used to determine whether other system software components such as browsers, utilities, and competing software will conflict with the software being tested.


Configuration Testing
Testing to determine how well the product works with a broad range of hardware/peripheral equipment configurations as well as on different operating systems and software.


End-to-End Testing
Similar to system testing, the 'macro' end of the test scale involves testing of a complete application environment in a situation that mimics real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems if appropriate.


Functional Testing
Testing two or more modules together with the intent of finding defects, demonstrating that defects are not present, verifying that the module performs its intended functions as stated in the specification and establishing confidence that a program does what it is supposed to do.


Independent Verification and Validation (IV&V)
The process of exercising software with the intent of ensuring that the software system meets its requirements and user expectations and doesn't fail in an unacceptable manner. The individual or group doing this work is not part of the group or organization that developed the software. A term often applied to government work or where the government regulates the products, as in medical devices.


Installation Testing
Testing with the intent of determining if the product will install on a variety of platforms and how easily it installs. Testing full, partial, or upgrade install/uninstall processes. The installation test for a release will be conducted with the objective of demonstrating production readiness. This test is conducted after the application has been migrated to the client's site. It will encompass the inventory of configuration items (performed by the application's System Administration) and evaluation of data readiness, as well as dynamic tests focused on basic system functionality. When necessary, a sanity test will be performed following the installation testing.


Integration Testing
Testing two or more modules or functions together with the intent of finding interface defects between the modules or functions. Integration testing is often completed as a part of unit or functional testing, and sometimes becomes its own standalone test phase. On a larger level, integration testing can involve putting together groups of modules and functions with the goal of completing and verifying that the system meets the system requirements. (see system testing)


Load Testing
Testing with the intent of determining how well the product handles competition for system resources. The competition may come in the form of network traffic, CPU utilization or memory allocation.
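As a rough sketch of the idea, the snippet below drives many concurrent requests at a system under test using the standard library. The `handle_request` function is a hypothetical stand-in; a real load test would target the deployed system over the network with a dedicated tool.

```python
# Load-driver sketch: many concurrent clients competing for resources.
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(n):
    """Hypothetical stand-in for the system under test."""
    time.sleep(0.01)  # simulate I/O or processing work
    return n * 2

def run_load(clients, requests_per_client):
    """Submit the full request volume through a pool of concurrent workers."""
    total = clients * requests_per_client
    with ThreadPoolExecutor(max_workers=clients) as pool:
        futures = [pool.submit(handle_request, i) for i in range(total)]
        return [f.result() for f in futures]

start = time.perf_counter()
results = run_load(clients=10, requests_per_client=5)
elapsed = time.perf_counter() - start
print(f"{len(results)} requests completed in {elapsed:.2f}s")
```

The interesting measurements are throughput and error rate as `clients` grows, not the correctness assertions themselves.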


Parallel/Audit Testing
Testing where the user reconciles the output of the new system to the output of the current system to verify the new system performs the operations correctly.


Performance Testing
Testing with the intent of determining how quickly a product handles a variety of events. Automated test tools geared specifically to test and fine-tune performance are used most often for this type of testing.
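At the small end of the scale, the standard library's `timeit` module can time a single code path; the `build_report` function below is a hypothetical example of a routine whose speed one might track between releases. Dedicated performance tools would instead measure an end-to-end scenario under realistic conditions.

```python
# Micro-benchmark sketch using the standard library's timeit module.
import timeit

def build_report(rows):
    """Hypothetical code path whose speed we want to track."""
    return "\n".join(f"{i}: {r}" for i, r in enumerate(rows))

rows = ["row"] * 1000
seconds = timeit.timeit(lambda: build_report(rows), number=100)
print(f"100 runs took {seconds:.3f}s")
```

Recording such numbers per release turns a subjective "it feels slower" into a measurable regression.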


Pilot Testing
Testing that involves the users just before actual release to ensure that users become familiar with the release contents and ultimately accept it. It is often considered a Move-to-Production activity for ERP releases or a beta test for commercial products. It typically involves many users, is conducted over a short period of time, and is tightly controlled. (see beta testing)


Recovery/Error Testing
Testing how well a system recovers from crashes, hardware failures, or other catastrophic problems.


Regression Testing
Testing with the intent of determining if bug fixes have been successful and have not created any new problems. Also, this type of testing is done to ensure that no degradation of baseline functionality has occurred.
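A common way to guard baseline functionality is to replay recorded inputs against stored known-good outputs. The `slugify` helper and its baseline values below are hypothetical, chosen only to show the pattern.

```python
# Regression-check sketch: current behavior compared against a baseline
# captured from the last known-good release.

def slugify(title):
    """Hypothetical helper: turn a page title into a URL slug."""
    return title.strip().lower().replace(" ", "-")

# Baseline outputs recorded before the latest round of changes.
BASELINE = {
    "Hello World": "hello-world",
    "Regression Testing": "regression-testing",
    "  Eye Care  ": "eye-care",
}

# Any mismatch means a change has degraded baseline functionality.
for title, expected in BASELINE.items():
    assert slugify(title) == expected, f"regression for {title!r}"
```

The same suite is re-run after every fix, so a patch that breaks older behavior is caught immediately.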


Sanity Testing
Sanity testing will be performed whenever cursory testing is sufficient to prove the application is functioning according to specifications. This level of testing is a subset of regression testing. It will normally include a set of core tests of basic GUI functionality to demonstrate connectivity to the database, application servers, printers, etc.


Security Testing
Testing of database and network software in order to keep company data and resources secure from accidental misuse, hackers, and other malicious attackers.


Software Testing
The process of exercising software with the intent of ensuring that the software system meets its requirements and user expectations and doesn't fail in an unacceptable manner. The organization and management of individuals or groups doing this work is not relevant. This term is often applied to commercial products such as internet applications. (contrast with independent verification and validation)


Stress Testing
Testing with the intent of determining how well a product performs when a load is placed on the system resources that nears and then exceeds capacity.


System Integration Testing
Testing a specific hardware/software installation. This is typically performed on a COTS (commercial off-the-shelf) system or any other system comprised of disparate parts, where custom configurations and/or unique installations are the norm.


Unit Testing
Unit Testing is the first level of dynamic testing and is the responsibility first of the developers and then of the testers. Unit testing is considered complete when the expected test results are met or any differences are explainable and acceptable.
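A minimal sketch of what a developer's unit test looks like in practice, using Python's standard `unittest` framework; the `add` function here is a hypothetical unit under test.

```python
# Unit-test sketch: one small unit exercised in isolation.
import unittest

def add(a, b):
    """The hypothetical unit under test."""
    return a + b

class TestAdd(unittest.TestCase):
    def test_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative_numbers(self):
        self.assertEqual(add(-1, -1), -2)

    def test_zero_identity(self):
        self.assertEqual(add(7, 0), 7)

if __name__ == "__main__":
    unittest.main()
```

The expected results are written down before the code runs, which is what makes a difference "explainable" when one appears.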


Usability Testing
Testing for 'user-friendliness'. Clearly this is subjective and will depend on the targeted end-user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Programmers and testers are usually not appropriate as usability testers.


White Box Testing
Testing in which the software tester has knowledge of the inner workings, structure and language of the software, or at least its purpose.
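The sketch below shows the structural flavor of this approach: the test cases for the hypothetical `classify` function are derived by reading its code and covering each branch, rather than from the specification alone.

```python
# White-box sketch: tests chosen to cover every branch in the code.

def classify(n):
    """Hypothetical function with three distinct branches."""
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"

# One test per branch, derived from the code's structure.
assert classify(-5) == "negative"   # covers the n < 0 branch
assert classify(0) == "zero"        # covers the n == 0 branch
assert classify(7) == "positive"    # covers the fall-through branch
```

A black-box tester might never try zero; the white-box tester picks it precisely because the code treats it as a separate case.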

Friday, November 24, 2006

Eye Care

This information is sure to help PC users. During a recent visit to an optician, one of my e-mail friends was told of an exercise for the eyes by a specialist doctor in the US, which he termed "20-20-20". It is apt for all of us who spend long hours at our desks looking at the computer screen. I thought I'd share it with you.

Step I: After every 20 minutes of looking at the computer screen, turn your head and try to look at any object placed at least 20 feet away. This changes the focal length of your eyes, a must-do for tired eyes.

Step II: Try to blink your eyes 20 times in succession, to moisten them.

Step III: Time permitting, of course, walk 20 paces after every 20 minutes of sitting in one particular posture. This helps blood circulation for the entire body.

Circulate this among your friends if you care for them and their eyes. They say that your eyes are the mirror of your soul, so do take care of them; they are priceless.
