quality Articles / Blogs / Perficient – Expert Digital Insights
https://blogs.perficient.com/tag/quality/

iCEDQ – An Automation Testing Tool
https://blogs.perficient.com/2024/07/23/icedq-an-automation-testing-tool/ | Tue, 23 Jul 2024

Data Warehouse/ETL Testing

Data warehouse testing is the process of verifying the data loaded into a data warehouse to ensure the data meets the business requirements. This is done by certifying data transformations, integrations, execution, and the scheduling order of various data processes.

Extract, transform, and load (ETL) testing is the process of verifying data as it is combined from multiple sources into a large, central repository called a data warehouse.

Conventional testing tools are designed for UI-based applications, whereas a data warehouse testing tool is purpose-built for data-centric systems and designed to automate data warehouse testing and generate results. It is also used during the development phase of a DWH.
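
To make this concrete, here is a minimal sketch of one of the most common data warehouse tests, a source-to-target row count reconciliation, written in plain Java with JDBC. The connection URLs and the table names (orders, dw_orders) are illustrative assumptions, not taken from any particular tool.

import java.sql.*;

public class RowCountReconciliation {

    // Compares the row count of a source table against its warehouse target.
    public static void main(String[] args) throws Exception {
        try (Connection src = DriverManager.getConnection("jdbc:sourceDbUrl");      // placeholder URL
             Connection tgt = DriverManager.getConnection("jdbc:warehouseDbUrl")) { // placeholder URL
            long srcCount = count(src, "SELECT COUNT(*) FROM orders");
            long tgtCount = count(tgt, "SELECT COUNT(*) FROM dw_orders");
            if (srcCount != tgtCount) {
                throw new AssertionError("Row counts differ: source=" + srcCount + ", target=" + tgtCount);
            }
            System.out.println("Reconciliation passed: " + srcCount + " rows");
        }
    }

    // Runs a COUNT(*) query and returns the single numeric result.
    private static long count(Connection c, String sql) throws SQLException {
        try (Statement st = c.createStatement(); ResultSet rs = st.executeQuery(sql)) {
            rs.next();
            return rs.getLong(1);
        }
    }
}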

iCEDQ

Integrity Check Engine For Data Quality (iCEDQ) is one of the tools used for data warehouse testing. It aims to overcome some of the challenges associated with conventional methods of data warehouse testing, such as manual testing, time-consuming processes, and the potential for human error.

It is an Automation Platform with a rules-based auditing approach enabling organizations to automate various test strategies like ETL Testing, Data Migration Testing, Big Data Testing, BI Testing, and Production Data Monitoring.

It tests data transformation processes and ensures compliance with business rules in a Data Warehouse.

Qualities of iCEDQ

Let us look at some of the traits that extend its usefulness for testing.

Automation

It is a data testing and monitoring platform for all sizes of files and databases. It automates ETL Testing and helps maintain the sanctity of your data by making sure everything is valid.

Design

It is designed with a greater ability to identify any data issues in and across structured and semi-structured data.

Uniqueness

Testing And Monitoring:

Its unique in-memory engine with support for SQL, Apache Groovy, Java, and APIs allows organizations to implement end-to-end automation for Data Testing and Monitoring.

User Friendly Design:

This tool provides customers an easy way to set up an automated solution for end-to-end testing of their data-centric projects, and it offers them email support.

Supported Platforms:

It is widely used by enterprises and business users on platforms such as web apps and Windows. It does not support macOS, Android, or iOS.

Execution Speed:

The new Big Data Edition tests 1.7 billion rows in less than 2 minutes, and a Recon Rule with around 20 expressions runs against 1.7 billion rows in less than 30 minutes.

With a myriad of capabilities, iCEDQ seamlessly empowers users to automate data testing, ensuring versatility and reliability for diverse data-centric projects.

Features:

  • Performance Metrics and Dashboard provides a comprehensive overview of system performance and visualizes key metrics for enhanced monitoring and analysis.
  • Data Analysis, Test and data quality management ensures the accuracy, reliability, and effectiveness of data within a system.
  • Testing approaches such as requirements-based testing and parameterized testing involve passing new parameter values during the execution of rules.
  • Move and copy test cases and supports parallel execution.
  • The Rule Wizard automatically generates a set of rules through a simple drag-and-drop feature, reducing user effort by almost 90%.
  • Highly scalable in-memory engine to evaluate billions of records.
  • Connect to Databases, Files, APIs, and BI Reports. Over 50 connectors are available.
  • Enables DataOps by allowing integration with any Scheduling, GIT, or DevOps tool.
  • Integration with enterprise products like Slack, Jira, ServiceNow, Alation, and Manta.
  • Single Sign-On, Advanced RBAC, and Encryption features.
  • Use the built-in Dashboard or enterprise reporting tools like Tableau, Power BI, and Qlik to generate reports for deeper insights.
  • Deploy anywhere: On-Premises, AWS, Azure, or GCP.

Testing with iCEDQ:

ETL Testing:

Several kinds of data validation and reconciliation of business data can be performed in ETL/Big Data testing; a sketch of a business-rule check follows the list.

  • ETL Reconciliation – Bridging the data integrity gap
  • Source & Target Data Validation – Ensuring accuracy in the ETL pipeline
  • Business Validation & Reconciliation – Aligning data with business rules

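As an illustration of the third item, here is a minimal, hypothetical sketch of a business-rule validation in plain Java with JDBC (not iCEDQ's own API); the table and column names are assumptions for the example.

import java.sql.*;

public class BusinessRuleCheck {

    // Verifies a simple business rule on the warehouse: no order may have a
    // negative amount or a missing customer key after transformation.
    public static void main(String[] args) throws Exception {
        try (Connection dw = DriverManager.getConnection("jdbc:warehouseDbUrl"); // placeholder URL
             Statement st = dw.createStatement();
             ResultSet rs = st.executeQuery(
                     "SELECT COUNT(*) FROM dw_orders WHERE amount < 0 OR customer_key IS NULL")) {
            rs.next();
            long violations = rs.getLong(1);
            if (violations > 0) {
                throw new AssertionError(violations + " rows violate the business rule");
            }
            System.out.println("Business rule check passed");
        }
    }
}
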
Migration Testing:

iCEDQ ensures accuracy by validating all data migrated from the legacy system to the new one.

Production Data Monitoring:

iCEDQ is mainly used in support projects for monitoring after migrating to the PROD environment. It continuously monitors ETL jobs and reports data issues through a mail trigger.

Why iCEDQ?

iCEDQ reduces project timelines by 33%, increases test coverage by 200%, and improves productivity by 70%.

Pros & Cons

In addition to its automation capabilities, iCEDQ offers unparalleled advantages, streamlining data testing processes, enhancing accuracy, and facilitating efficient management of diverse datasets. Moreover, the platform empowers users with comprehensive data quality insights, ensuring robust and reliable Data-Centric project outcomes.

Rule Types:

Users can create different types of rules in iCEDQ to automate the testing of their Data-Centric projects. Each rule performs a different type of test case for different datasets.

Rules

iCEDQ System Requirements

Review iCEDQ’s technical specifications and system requirements to determine whether it is compatible with your operating system and other software.

Icedq_Details

To successfully deploy iCEDQ, it is essential to consider its system requirements. Notably, the platform demands specific configurations and resources, ensuring optimal performance. Additionally, adherence to these requirements guarantees seamless integration, robust functionality, and efficient utilization of iCEDQ for comprehensive data testing and quality assurance.

Hence, iCEDQ is a powerful Data Migration and ETL/Data Warehouse Testing Automation Solution designed to give users total control over how they verify and compare data sets. With iCEDQ, they can build various types of tests or rules for data set validation and comparison.

Resources related to iCEDQ – https://icedq.com/resources

Why Healthcare Should Consider South America for Quality Offshore Development
https://blogs.perficient.com/2020/08/03/why-healthcare-should-consider-south-america-for-quality-offshore-development/ | Mon, 03 Aug 2020

As many who follow Perficient know, we recently completed our largest acquisition to date by bringing PSL on board. Our healthcare practice is excited for a number of reasons that I want to explore, but it boils down to quality offshore development. We think it makes a lot of sense for Healthcare Organizations (HCOs) to look closely at an offshore practice like PSL. But first, let me give some highlights before diving into why we think this makes sense for Perficient and for our healthcare clients.

By the Numbers

In this day and age, you don’t buy a fledgling offshore firm. That’s especially true when Perficient already has significant quality capability in our various India and China offices. So you want quality and scale, and that’s what brought us to them.

Psl Numbers

I want to highlight that the scale of people and the scale of technologies are key here. Perficient takes on work we can succeed at, and that only works if you have the necessary skills.

Psl Numbers 2

Then of course there’s the quality side of things. Our first offshore office, in China, was, as far as we know, the first in the world to achieve CMMI Level 5 for Agile certification. PSL was the first to achieve it in Latin America and the 8th in the world. The focus on quality makes a difference.

By Alignment

Anyone who has been through a merger knows that cultural fit and alignment are kings. When you take two different business cultures and push them together, success only comes if they can merge in shared values and shared goals. If you have widely divergent goals and values, then you increase the likelihood of failure. Now think about what this means for consulting projects in general and offshore projects in particular. You must create those same shared values and goals. From the first time I heard from PSL, and then the first time I spoke to them, the following shared values came through:

  1. Focus on Quality Offshore Development.  I mentioned that they too received CMMI certification very early on.  We share that not because we love the certification but because of what the certification brought us.
  2. Success.  Everyone says they want success, but I heard one sentence from PSL that we at Perficient use quite a bit: “We have told a client we won’t do certain work if it’s not within our skill set.”  In that particular case, 2/3 of the project mentioned was aligned, but one technology that was key to the project wasn’t a focus.  They walked away.  We’ve done that ourselves rather than take on something where we would have a high likelihood of failing our client.
  3. Evolution.  Over time, Perficient has reinvented itself several times. PSL seems to have done that as well.  They evolve technical capabilities. They focus on improving their employees’ skills. They focus on constant internal improvement with Agile methodologies, testing, and DevOps.
  4. Culture.  In my discussions it was obvious the cultural fit was high.  It makes for an easier understanding between our teams and with our clients.

What It Means to Healthcare

You’ll notice that I have yet to discuss a single technology capability.  That will come a little later.  In reality, we’ve learned that just having offshore capability doesn’t lead to success.  You have to set the offshore project up for success.  There are best practices.  Both the client and the consulting firm have to invest in it.  That’s why we are excited about the value our new capabilities with PSL will bring to our healthcare clients.  Here are a couple of reasons:

  1. Our partners in Colombia have a consulting mindset.  The first answer to the question, “Will you do this?” isn’t necessarily yes. Sometimes you have to help clients understand the pros and cons of what they want to do to ensure they make the right decisions.  This leads to quality offshore development.
  2. Time zone.  Being in the same time zone as the US (Eastern) means it’s easier to communicate.  Since communication is one of those important ingredients, we don’t have to work as hard at it.
  3. Wide range of technical capabilities.  This includes testing, DevOps, integration, data, and cloud.  If there’s a common theme we see among our healthcare clients, it’s that they are starting to move towards cloud and agile in a big way.  That includes the use of cloud for data and integration capabilities.  The alignment of skills makes a lot of sense.
  4. Lower overall costs. Like other offshore centers, our Colombia GDC has significantly lower costs than what you find in the United States.  This helps when you only have so much budget but still want to maintain high quality.
  5. Product and Agile mindset.  Many of the Colombia GDC clients use them to create and improve products.  In many ways they are the product development team.  Now you might wonder why that is relevant to healthcare. Many of our more innovative clients in the payer, provider, and medical device worlds are actively creating products that they want to use internally and market to others.  Our Colombia GDC brings the right mindset and experience to do this.

I’ll focus the next post on more in-depth details of how our Colombia GDC aligns to healthcare.

Are you good enough?
https://blogs.perficient.com/2017/09/18/are-you-good-enough/ | Mon, 18 Sep 2017

I saw this at a restaurant and could immediately relate it to the IT industry.

If you are not HAPPY with your PRODUCT, DON’T Put it On the PLATE !!

Quality is a collective responsibility. Everybody from Pre-Sales to Implementation has equal accountability.
But the core lies with the Technology Team –
As a Business Consultant, if you are not happy with your requirements, why and how can you complete it?
As a Technical Consultant, if you are not happy with your code, why and how can you push the code?
As a Testing Consultant, if you are not happy with your test results, why and how can you pass the test?

One should not deliver only good ingredients, they should also deliver a good product.
Do you agree?

Automation Testing – What to Expect and What Not to Expect
https://blogs.perficient.com/2017/05/30/automation-testing-what-to-expect-and-what-not-to-expect/ | Tue, 30 May 2017

Myth 1: Automation replaces Manual Testers.

Reality: After the scripts are developed, the common misconception is that we can replace the manual testers with Automation, but this is not true. Automation is a program which will test the flow of the system. Even a small defect can be missed if we don’t write scripts to look for it, but this can always be found during manual test execution. Also, if the application being tested undergoes changes, this often results in the failure of scripts and the manual tester must test the changes until the scripts are updated.

Even though automation reduces the amount of time taken by manual testing, one needs to spend time on test results analysis to make sure that automation has not missed any critical defects.

Solution:

  • We always need a human brain to analyze the test results. Robots cannot replace humans in automation testing.
  • For applications, which are changing frequently, automation testing can only be used as a reference and not as a replacement for a manual tester.
  • Automation should be used for static applications which are independent of other modules and which need to be regression tested.

Myth 2: Automation executes fast whereas manual testing executes slow.

Reality: Automation runs faster than manual testing, but when the properties of the objects involved are not straightforward, automation can take time to identify those objects.

For example: if a page has multiple headers with similar properties, automation might take some time to identify those headers, whereas a manual tester can easily check for their existence.

Preparation of test data also takes time because it involves writing logic for automation testing, while a manual tester does it faster because they know which test data to choose for the application, which in turn increases the execution speed.

When testing applications that communicate with multiple systems, manual and automated tests take the same time to execute because of the dependency on the external systems.

Solution:

One must take into account the time needed for execution, the complexity of the application, and the number of external systems involved. All of this must be done during requirement analysis. A feasibility study must be done on the time taken for execution and the effort spent in automating the application.

  • All the stakeholders should be made aware of the automation effort involved.
  • The estimation effort should be calculated by taking the following factors into consideration: object complexity, flow of the system, time needed to build the scripts, and lastly the execution time of automation versus manual testing.

Myth 3: Users just need to click a button to execute the automated test cases.

Reality: When automation is completed and delivered to end users, they assume that scripts can be executed without any change in test data, but this happens only in rare situations. End users should also spend time on execution. They should be aware of the flows of the application, the keywords used in the application, and the test data that needs to be given.

Solution:

We should have periodic discussions about the test case flow and test data with the end users, which will avoid wrong expectations.

  • We should also understand which test data can be constant and which can be used as an input. This will minimize the number of data inputs needed for each test case.

Myth 4: Expecting 100% automation test execution without any failures.

Reality: Automation is also like application development, with a lot of limitations, and hence executing all scripts without errors is impossible. There can be many reasons, like system slowness, environment outages, data issues, UI changes, etc. A retry sketch for such transient failures follows the solution list below.

Solution:

End users should understand the reason for the failure before passing it to the automation team. This helps in understanding the flows of the system and the inputs needed for it, rather than depending on the automation team.

  • Automation is ideal for stable systems or for the systems for which the development is complete.
  • We must make sure all test cases are updated with the latest data and there is no factor affecting the system execution.
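
One common mitigation, hinted at above, is to retry a flaky step a few times before declaring a real failure, so that transient slowness or environment blips do not fail the whole run. A minimal, generic sketch in Java, not tied to any particular tool:

import java.util.concurrent.Callable;

public class RetryingStep {

    // Runs a step, retrying up to maxAttempts times with a growing pause,
    // and rethrows the last failure if no attempt succeeds.
    static <T> T withRetries(Callable<T> step, int maxAttempts) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return step.call();
            } catch (Exception e) {
                last = e;                          // remember the failure
                if (attempt < maxAttempts) {
                    Thread.sleep(1000L * attempt); // back off before the next attempt
                }
            }
        }
        throw last != null ? last : new IllegalArgumentException("maxAttempts must be >= 1");
    }
}

A failure-analysis step can then separate steps that never pass (likely real defects) from steps that pass on retry (likely environment issues).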

Myth 5: When automating a new application, one can simply study the existing framework, and this doesn’t need any investment of time.

Reality: Practically, automation is not easy. It needs a lot of time, money, and mainly patience. If used in the right way, automation can help an organization in a big way. Testers need to understand the domain and the test cases to be automated, and choose the right framework to build a strong foundation in automation.

Automation is also like application development, which needs a lot of testing. The automation scripts must be tested thoroughly with all sets of test data, both positive and negative. Improper or partial testing results in the failure of scripts and leads to a lack of confidence in the tool.

Solution:

  • Thoroughly understanding the application that needs to be automated gives better clarity to the scripts.
  • While setting the timeline for automating the scripts, the time needed for requirement gathering, design, coding, and testing should also be considered.
  • Discuss the key areas with manual testers and developers before deciding on a suitable framework and tool.

Conclusion:

Automation testing can deliver long-term benefits, but it is not suitable for immediate results, as the scripts must be constantly updated and maintained. We should also understand that automation is not a magic tool to meet execution deadlines; it has its own limitations. Automation testing is highly effective and yields the best results when it is combined with manual testing.

Agile & CMMi – And they lived together happily ever after!
https://blogs.perficient.com/2017/01/02/agile-cmmi-and-they-lived-together-happily-ever-after/ | Mon, 02 Jan 2017

Fairy tales have never lost their charm, and they have always stayed close and strong in the hearts of people. I have made a maiden attempt to explain the compatibility of Agile and the CMMi model as a fairy tale, and this is how it goes:

Once upon a time, there was a software town. The town’s people were efficient in building software projects. In that town, there was a girl named Agile. She was a simple, no-nonsense girl. She valued:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

She was robust enough to manage any change at any phase of a software project and that too with minimal effort.

All girls in software town were after a handsome guy called CMMi model. He prescribed a set of best practices to be followed in software projects. He groomed the software projects from an ad-hoc management level to an optimized management level.

To be appraised as a compatible partner of CMMi model, was every girl’s dream in software town.

One day Agile was walking along with her peers Waterfall and RAD, and they made fun of her, saying that with her simple nature she would not be compatible with the CMMi model.

Agile was in tears, she cried and prayed. A fairy appeared and she said:
“If we can prove some myths as untrue, you can work your way to align with CMMi Model”:

Myth 1: CMMi model and Agile are hard to be aligned.
Myth Buster: CMMi and Agile complement each other: CMMi speaks to what to do and Agile speaks to how to do it. Together, they guide well in doing the project right and ending up with the right deliverable.

Myth 2: A right balance cannot be struck between the Agile-given flexibility and motivation to projects and practitioners and the CMMi model-given organizational learning and improvement opportunities.
Myth Buster: When the right balance is struck between Agile and the CMMi model, it energizes the organization to improvise and innovate better. The balance can be struck by automating project processes and leveraging appropriate tools for them.

Myth 3: CMMi Model has to be applied to projects.
Myth Buster: CMMi describes a set of best practices and has to be taken as a reference for projects. CMMi is not a formula for projects but a reference that has to be implemented in projects. Projects are supposed to learn and relate the model to their actual scenarios.

Myth 4: Adherence to the CMMi model involves a lot of documentation for the projects, while Agile projects do not need documentation.
Myth Buster: Agile does value documentation, but a bit less than working software. With proper interpretation of the CMMi model’s needs, optimal documentation can be maintained that suffices for the needs of both Agile and the CMMi model.

Myth 5: CMMi high maturity insists on precise predictability, while Agile insists on reaction to changes.
Myth Buster: Actually, CMMi does not ask for an absolute value of prediction, but at least a prediction range that will allow the project to make informed decisions. Better process management is the key to catering to CMMi and Agile together.

Boom…. The myths are not true anymore…
Then the fairy said, “Let’s get you ready my Agile lady, you need some tools to evidence your process activities better”

The fairy added some tools to the Agile projects. The tools took care of configuration management, metrics calculation, test-driven requirement management, continuous review, continuous testing and continuous integration. Now almost all of Agile’s process activities could be demonstrated and evidenced.

Agile presented herself to CMMi and CMMi was very much impressed by Agile’s fresh approach. CMMi was happy with Agile’s tools and performance, and appraised Agile as a compatible partner.

Agile & CMMi lived together happily ever after.

References:
CMMI® or Agile: Why Not Embrace Both! – CMU/SEI-2008-TN-003
Real experiences while implementing CMMi Level 5 practices in Perficient Chennai

Perficient’s Top 10 Life Sciences Blog Posts of 2014, Letterman-Style
https://blogs.perficient.com/2014/12/24/perficients-top-10-life-sciences-blog-posts-of-2014-letterman-style/ | Wed, 24 Dec 2014

As we wrap up 2014, I thought it would be neat to see what our readers were, well, reading. Without further ado, here are the top 10 blog posts Perficient’s life sciences practice wrote that seemed to be popular among readers. They’re ranked David Letterman-style, one being most popular (i.e., viewed).

top-10-life-sciences-blog-posts

  1. The One Feature CROs and AROs Love in Argus Safety
  2. Wait, You’re Not Using an Investigator Portal?
  3. The Key To Delivering A CTMS Project On Time and On Budget
  4. Oops! I Just Deleted Production Data For An EDC Study
  5. How Many Environments Do You REALLY Need?
  6. Collecting and Reporting Adverse Events in Excel
  7. The CTMS Report Your Boss Wants to See
  8. This Recent FDA Warning Letter Shows You Exactly Why Quality Matters
  9. 45+ Documents That Life Sciences Companies Can Sign With Digital Signatures
  10. Should You Be Afraid Of Ebola?

On a personal note, thank you very much for your continued readership!

This Recent FDA Warning Letter Shows You Exactly Why Quality Matters
https://blogs.perficient.com/2014/12/17/this-recent-fda-warning-letter-shows-you-exactly-why-quality-matters/ | Wed, 17 Dec 2014

“Quality, quality, quality.” You hear it all the time from your QA colleagues. There are so many forms to fill out, trainings to complete, procedures to read and follow, and internal audits to produce records for. Sometimes, it’s so overwhelming that we’re tempted to just tune it out.

That is, until we read an FDA Warning Letter like the one issued on November 24 to a Florida-based medical device company. The FDA cites 11 separate violations related to the company’s quality management system (QMS)…and its gaps.

Violations 9, 10, and 11 are related to failing to follow FDA-mandated reporting procedures for medical devices.

While some of the content of the letter might make you chuckle, like the President admitting to false advertising, the fact that they couldn’t produce their device risk assessment in English (despite being U.S.-based), or the fact they’re 2.5 years late in reporting product complaints, the truth is that we’re all guilty of quality violations from time to time. Hopefully, not nearly as extreme as this particular company, but nobody’s perfect.

I propose that we use this Letter as a reminder to turn inward, take a look at ourselves, and make sure we’re genuinely doing all that we can to protect the public. Instead of tuning out quality, let’s crank up the volume.

For an assessment or a mock audit of your quality management system (QMS), contact us.

Automating REST services using Soap UI Pro
https://blogs.perficient.com/2014/12/02/automating-rest-services-using-soap-ui-pro/ | Tue, 02 Dec 2014

“A Web service is a method of communication between two electronic devices over a network. It is a software system designed to support interoperable machine-to-machine interaction over a network.”

There are various tools available to test web services. Some of them are Soap UI, Soap UI Pro, Test Maker, Web Inject etc. The most common tool we use is Soap UI.

SoapUI Pro (the licensed version) comes with a user-friendly UI. It provides utilities to create a test data step (to read/write external files), a data connection step (to read/write database tables), and a property transfer step (to transfer properties between steps). It also provides a Groovy script step to achieve validations/operations not possible with the default SoapUI steps.

Let’s look at the task that needs to be performed:-

Objective: – To automate a sample login (authenticateMember) REST web service using Soap UI Pro.

Resources: – Soap UI Pro tool, Input Excel data file, Output Excel data file.

Solution: – I will be using pre-defined SoapUI steps to read/write data from an external Excel file and Groovy script to perform a few validations/operations that are not possible with the default steps.

Step 1:- Prepare the Input Excel data file (Sheet Name- Login)

InputDataSheet

Assumption is that the service accepts username and password; we are passing both the parameters from the test data input sheet.

Step 2:- Prepare the Output Excel data file (Sheet Name – Login)

OutputDataSheet

Assumption is that the service returns statusCode and statusDesc. Let’s see what the extra output fields are:-

  • testResult – Insert either of Pass/Fail based on assertions.
  • statusCode – Insert the status code returned from the REST response.
  • statusDesc – Insert the status description returned from the REST response.
  • Request – Insert the raw REST request.
  • Response – Insert the complete REST response returned from the service.

Step 3:- Create the Automation test suite.

  • Open Soap UI Pro, import your project.
  • Go to the webservice endpoint and right click to see the context menu.
  • Click on option “Generate TestSuite”.
Context Options

  • In the Generate Test Suite pop-up, check the checkbox authenticateMember (sample service name) service and click OK.

 

Add Service Request

  • Once done, we will be able to see the test suite (authenticateMember) generated with the selected service.

ProjectTree-1

 

Step 4:- Create testcase under test suite.

  •  Go to the created test suite and right click to see the context menu.
  •  Click on option “New TestCase”.
Context Options

  • In the New Test Case pop-up, give the name of the new testcase and click OK.
NewTestCase

 

  • Once done, we will be able to see the test case (AuthenticateMember) generated.
AuthenticateMember

 

Step 5:- Create test steps under created test case.

(i)  Create DataSource step to read username and password from the test data input sheet

  • Go to the AuthenticateMember testcase and right click to see the context menu.
Context Options

 

 

 

  • Click on option “DataSource”.
  • In the Add Step pop-up, give the name of the step and click OK.
DataSource

  • Once done, we will be able to see the test step (DataSource) generated.
DataSource Step

 

  • In the DataSource step, click on the dropdown next to DataSource label and select “Excel”.
DataSource Configuration

 

  • Since we need 2 values (Username and Password), we will be creating 2 properties in this step to capture the values from the input data Excel file. To create a property, click on the “+” sign in the left corner, enter the property name, and click OK.
AddProperty

 

 

  • Create properties with name “Username” and “Password”.
DataSource with properties created

 

  • Now we need to browse for our test input data sheet in the File: text box, give the proper sheet name in the Worksheet text box, and give the cell number where the Username and Password cells are located in the test input data sheet.
  • Once done, run the test step by clicking on the green coloured Play button. We will be able to see the properties populated with the values we stored in the test input data sheet.
DataSource executed

 

(ii)  Create REST request Test Step

  • Assuming that authenticateMember REST request takes Username and Password as test input.
  • Go to the AuthenticateMember testcase and right click to see the context menu.
Context Options

 

  • Click on option “REST Test Request”.
  • In the Add Step pop-up, give the name of the test step and click OK.
REST Test Request

  • Once done, we will be asked to select the REST request which we want to import. Select the appropriate REST request (depending on the project) from the dropdown and click OK.
New Rest Request

 

 

  • Once done, we will be able to see REST Test Request test step created.
REST Test Request created

 

(iii)  Parameterize the input data in created REST Test Request test step.

  • Go to the Request tab where we need to enter Username and Password.
REST Test Request with request parameters

 

 

  • To remove the hard-coded values, right click on “admin” so that the context menu appears and select GetData -> DataSource -> Username. Do the same for Password.
ParameterizeUsername

 

 

  • Once done, we will be able to see the input data has been parameterized.
Parameterized

(iv)  Execute the REST request and create assertions.

  • Assuming that if we pass valid Username and Password, statusCode=0 and statusDesc=Success is returned.
  • Hence we need to create assertions for the above two fields.
  • Go to statusCode field in REST response and right click so that context menu appears.
REST Response

 

 

  • Select AddAssertion -> forContent.
Add Assertion

  • Verify Xpath for the source node of “statusCode” and Expected Result
Xpath Match Config

 

 

  • Do the same for statusDescription.
  • Once done, we will be able to see 2 Assertions created at the left bottom of REST request step.
CreatedAssertions

(v) Create Groovy script to validate the assertions.
  • We will be writing customized Groovy code to validate whether the assertions PASSED or FAILED. If all assertions pass, we will insert “PASSED” into the execution result column in the output data sheet; if any assertion fails, we will insert “FAIL”.
  • Go to the AuthenticateMember testcase and right click to see the context menu.
Context Options

  • Click on option “Groovy Script”.
  • In the Add Step pop-up, give the name of the test step and click OK.
Groovy Script

  • Once done, we will be able to see the test step (Groovy Script) generated.
Groovy Script step

Below is the code to validate the assertions:

// import class for assertion status
import com.eviware.soapui.model.testsuite.Assertable

// Get handles to the REST request step (whose assertions we check)
// and the DataSink step (where the result will be written)
def testStepSrc = testRunner.testCase.getTestStepByName("REST Test Request")
def propertyVal = testRunner.testCase.getTestStepByName("DataSink")

// Count the number of assertions created on the REST request
def counter = testStepSrc.getAssertionList().size()

// Pass only if every assertion status is VALID; fail as soon as one is not
def output = "PASSED"
for (i = 0; i < counter; i++) {
    String status = testStepSrc.getAssertionAt(i).getStatus()
    if (status != "VALID") {
        output = "FAIL"
        break
    }
}

testRunner.testCase.testSuite.setPropertyValue("result", output)

// Create a property Status and put either PASSED/FAIL in it depending on the assertion status
propertyVal.setPropertyValue("Status", output)

 

(vi)  Create DataSink step

  • The DataSink step is required to set up property values, and the same can be directly inserted into the output data sheet.
  • Since we will be inserting 5 values (result, statusCode, statusDescription, request and response), we will be creating 5 properties in DataSink step and insert the same into output data sheet.
  • Go to the AuthenticateMember testcase and right click to see the context menu.
Context Options

 

 

  • Click on the “DataSink” option.
  • In the Add Step pop-up, give the name of the test step and click OK.
Data Sink

 

  • Once done, we will be able to see DataSink step created.
Data Sink step

 

 

  • Create 5 properties and the values will get supplied from the propertyTransfer step (this will be added in the next step).
Create Data Sink properties

 

  • To create the properties, follow the same approach as we followed while creating properties in the DataSource step.
  • Enter the Configuration parameters; here we will be passing the path of output file, the specific worksheet and cell number on which we want to insert the values.
  • No need to give values to the created properties, this will get inserted from propertyTransfer step.

 

(vii)  Create PropertyTransfer

  • Property transfer is required to transfer property values from one step to another. We will be transferring property values from the REST response and the Groovy script to the Data Sink.
  • Go to the AuthenticateMember testcase and right click to see the context menu.
Context Options

 

  • Click on the “Property Transfer” option.
  • In the Add Step pop-up, give the name of the test step and click OK.
Property Transfer

 

  • Once done, we will be able to see Property Transfer step created.
Property Transfer step

 

 

  • We will be creating 4 property values (statusCode, statusDescription, request and response). The Status property will be populated into the DataSink step from the Groovy script.
Property Transfer created

  • After we are done creating the properties, we need to select the Source and its Property, and the Target and its Property. This will enable the transfer of properties. To transfer the statusCode property:

 

Source – REST Test Request

Property – ResponseAsXML

Xpath – node description of statusCode

 

Target – Data Sink

Property – statusCode

 This will transfer the statusCode value to DataSink’s statusCode property. Do the same for the other properties as well.

 

  • After you are done, execute this step. Open the DataSink step and execute it; we will be able to see property values for statusCode, statusDescription, request and response.
Data Sink step populated

  • This is how the project flow will look. Because the steps are dependent on one another, the proper order is mandatory for the execution to go through.
ProjectTree

 

 

Step 6:- Execute the test suite

  • Execute the testsuite by clicking on the Play button for the test suite.
Test Suite

 

 

  • After we are done, verify the DataSink step. We will be able to see all the property values.
Data Sink populated

 

 

  • Verify the output data sheet. The data has been inserted into the output data file starting from the cell we mentioned in the DataSink step.
OutputDataSheet (After execution)

 

Configuration of Sahi Automation tool
https://blogs.perficient.com/2013/12/19/configuration-of-sahi-automation-tool/ | Thu, 19 Dec 2013

There are various automation tools available in the market, and open source tools have an upper hand over licensed ones because there is always an extra investment needed to buy a commercial product. If the goal can be achieved with a reliable open source tool, why go for a licensed one? Here I am going to discuss a new open source automation tool, “Sahi”.

“Sahi is a mature, business-ready tool for automation of web application testing. Sahi is available as an Open Source free product and as Sahi Pro, the commercial version. For testing teams in product companies and captive IT units which need rapid reliable web automation, Sahi would be the best choice among web automation tools.”

Sahi is mainly used to do cross-browser compatibility testing, which is nothing but making sure the application works properly across various browser platforms. Let’s see how we configure Sahi on our local system.

Step 1 – “Download the Sahi Open Source tool”:- Download the Sahi Open Source zipped file from the following location:- http://sourceforge.net/projects/sahi/files/sahi-v44/

Step 2 – “Extract the files in the local system”:- Once downloaded, extract it to a folder on your local system (say “D:\Sahi\”).

Step 3 – “Configure the installation path on every browser”:- After extraction, we can see the default folder structure under “D:\Sahi” as shown below:-

Folder Structure

Under <home_dir>/userdata/config, there is a file “browser_types.xml”. I need to edit this file mainly to set the correct <path> of each browser installed on my system. For now, ignore the remaining fields. Since this is the default file, I can either comment out or delete the browser tags for the browsers not installed on my system, as shown below:-

BrowserTypes_xml

Step 4 – “Configure browser proxy settings on every browser”:- Since Sahi runs on proxy port 9999, I need to configure the following proxy host and port on all the browsers in which I need to execute the test:

Http Proxy/Address as “localhost”
Http Port as “9999”

Change proxy settings in IE – http://windows.microsoft.com/en-IN/windows-vista/Change-proxy-settings-in-Internet-Explorer
Change proxy settings in Chrome – http://www.googlechrometutorial.com/google-chrome-advanced-settings/Google-chrome-proxy-settings.html
Change proxy settings in Firefox – http://www.wikihow.com/Enter-Proxy-Settings-in-Firefox

Note: – In case you have other browsers installed in your system, you can change the proxy settings for them as well.

Step 5 – “Start the Sahi proxy server that listens to port 9999”:- In the folder structure shown above, under <home_dir>/bin, there is a file “Sahi.bat”. I will execute this file to start the server as shown below:-

Sahi Proxy start

Once the proxy server is started, you can see in the above snapshot that it reads the browsers from the browser_types.xml file and listens on port 9999.

Note: – In case you get any error after you execute the Sahi.bat file, you need to check whether an incorrect path is specified in any of the configuration files.

Step 6 – “Open the browser and invoke Sahi Controller”:- After the proxy server is started, open the browser in which the proxy host and port have been set and press “Ctrl+Alt+DblClick” to invoke the Sahi Controller. Below is a snapshot of the Sahi Controller:-

Sahi Controller

Once the Sahi Controller is opened, we can start recording/playing back using the tool. I will cover the workings of Sahi in my next blog.

Limitations of Automation Testing
https://blogs.perficient.com/2013/11/13/limitations-of-automation/ | Wed, 13 Nov 2013

So far we have seen what automation can do to help us in reducing human effort, time, cost, etc. Here I will discuss a few scenarios for which automation either can’t be done or is not required.

There are certain tasks which can be performed only using automation tools, such as load, endurance, and scalable performance testing, which simulate hundreds of users. However, let’s see a few of the tasks that cannot be automated:-

–       Image Re-Captcha

=========================

Image Re-Captcha cannot be automated due to security measures implemented in the application. This is nothing but an image with distorted letters printed on it which can be identified only with the naked eye. The existing automation tools can’t read those distorted letters. There are a few OCR (Optical Character Recognition) software packages available in the market, but they are not 100% effective. Automation scripts won’t do that for you.

–       Adhoc Testing

=========================

“Ad hoc testing is a commonly used term for software testing performed without planning and documentation.” This type of testing is performed to learn more about the product by doing random testing. The main goal of ad hoc testing is to find important defects quickly. Automation scripts won’t do that for you.

–       One time manual test

=========================

There are a few test cases for which a one-time manual test is enough to say whether the result will pass or fail. For example: selecting a radio button; once it’s selected, there is no way it can get un-selected unless the page is refreshed. Such test cases have minimal risk of failure, and that doesn’t justify automating them.

–       Improper Framework may cause overhead

=================================

With an improper framework, the chance of creating duplicate scripts increases. A lack of a structured approach to the framework increases redundancy and decreases maintainability.

–       Unidentified element properties

=================================

Most automation tools work on JavaScript-enabled browsers to manipulate the UI of an application. So, to work on any element present in a webpage, our script needs to identify the element from the properties defined by developers. If HTML tags are created that don’t have unique properties identifying a specific element, automation scripts will fail to perform actions on it.

Though there are ways to identify those elements as well, these require some coding or the use of external libraries, which again results in cost and time overhead; one such workaround is sketched below.
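
For instance, with the newer Selenium WebDriver API (rather than the Selenium RC API used elsewhere in these posts), one workaround is an XPath anchored on a stable ancestor. A minimal sketch, where the page URL and element structure are hypothetical:

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;

public class NoUniqueProperty {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        driver.get("http://example.com/store"); // hypothetical page

        // The price <span> has no id/name of its own, so anchor the locator on a
        // labelled ancestor and walk down to the element by position instead.
        WebElement price = driver.findElement(By.xpath(
                "//div[@class='product'][.//h2[text()='Family Album']]//span[2]"));
        System.out.println(price.getText());

        driver.quit();
    }
}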

 

Use of VBScript with Selenium to connect to database
https://blogs.perficient.com/2013/10/08/use-of-vbscript-with-selenium-to-connect-to-database/ | Tue, 08 Oct 2013

While most automation scripts manipulate just the UI of an application, by making our scripts communicate with the database we can accomplish more complicated tasks. Here is an example:-

Suppose you have to automate an application for an online voting system wherein, once a vote is cast using the application, it goes to an external system and a verification process starts to check whether the cast vote is correct or not. If the vote is correct, it is counted; otherwise the count remains the same.

The following are the statuses of the CASTED_VOTE column in the VOTE_STATUS table configured in the database:-

VOTE_STATUS

 

So usually our Automation script would be as follows:-

–       Record the steps to cast the vote

–       Wait till we receive the verification response from the external entity (unknown time).

–       Go to the UI screen where total no. of votes are displayed and verify the count.

So the loophole in the above example would be the unknown time (i.e., how long the script needs to wait before it executes the next command). This will create problems during execution when inputs are provided in bulk. But if I can make my script communicate with the database, I can pin down the unknown time as the time until the CASTED_VOTE column equals 3 (i.e., verification done); a JDBC sketch of this polling approach appears after the list below.

So this will make our Automation script more realistic as shown below:-

–       Record and play the steps to cast the vote

–       Wait till we see CASTED_VOTE column status = 3

–       Go to the UI screen where total no. of votes are displayed and verify the count.
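
The “wait till status = 3” step could equally be done directly from the Java side with JDBC instead of VBScript; here is a minimal sketch of such a polling helper. The VOTE_ID column and the timeout handling are assumptions added for the example:

import java.sql.*;

public class VoteVerificationWait {

    // Polls the VOTE_STATUS table until CASTED_VOTE reaches 3 (verification done)
    // for the given vote, or fails once the timeout expires.
    static void waitForVerification(Connection conn, long voteId, long timeoutMs) throws Exception {
        long deadline = System.currentTimeMillis() + timeoutMs;
        String sql = "SELECT CASTED_VOTE FROM VOTE_STATUS WHERE VOTE_ID = ?"; // VOTE_ID is hypothetical
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setLong(1, voteId);
            while (System.currentTimeMillis() < deadline) {
                try (ResultSet rs = ps.executeQuery()) {
                    if (rs.next() && rs.getInt(1) == 3) {
                        return; // verification done, safe to verify the count on the UI
                    }
                }
                Thread.sleep(2000); // poll every 2 seconds instead of a blind fixed wait
            }
        }
        throw new AssertionError("Vote " + voteId + " was not verified within " + timeoutMs + " ms");
    }
}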

 

So let’s look at the task that needs to be performed:-

Objective: – Create an Automation script that communicates with the database to provide a more realistic automation approach.

Resources: – Selenium, Java, VBScript

Solution: – I will write VBScript code which will be invoked via Selenium to connect to the DB.

To connect to the DB using VBScript, I require a connection string wherein I specify the machine details where my database is located. The connection string will vary according to the database server we are using.

The different types of Connection Strings can be found under the below link:-

http://www.connectionstrings.com

 

Below is the sample Connection.vbs file:-

======================================================

Dim connection
Dim RecordSet

Set connection = CreateObject("ADODB.Connection")

' The connection string varies by database server; this one is for MySQL via ODBC
connection.ConnectionString = "Driver={MySQL ODBC 5.1 Driver};Server=myServerAddress;Database=myDataBase;User=myUsername;Password=myPassword;Option=3;"
connection.Open

Set RecordSet = CreateObject("ADODB.Recordset")

' Open the recordset against the connection we just created
RecordSet.Open "<SQL query to fetch or insert records from DB>", connection

' <Operations to be performed>

RecordSet.Close
Set RecordSet = Nothing
connection.Close
Set connection = Nothing

========================================================

So the entire Selenium script looks as follows:-

/* Packages to import */

public class DemoScript {

    public void castVoteAndVerify() throws Exception {
        // code to open the application and cast the vote goes here

        // below is the code to invoke the VBScript from the Selenium script
        Runtime.getRuntime().exec("wscript <path of Connection.vbs file>");
    }
} // End of Java code

 

Data driven testing using Selenium
https://blogs.perficient.com/2013/06/23/data-driven-testing-using-selenium/ | Mon, 24 Jun 2013

Let’s see what data-driven testing is.

“It is an automation framework where test input and/or output values are read from data files. The different data files may include ODBC sources, csv files, Excel files, ADO objects, etc. The data is then loaded into variables in recorded or manually coded scripts.”

Now let’s take an example that shows why we need data-driven testing.

Suppose there is an online store wherein multiple items are displayed with their attributes. The customer who visits the store online needs to enter the following information in order to add the product to the cart:-

–       Name

–       Product

–       Quantity

–       Date of Purchase

–       Street

–       City

Let’s look at Automation (JUnit) pseudo code to automate the above-mentioned process:-

/* Packages to import */

public class DemoScript {

    public void submitRequest() {
        // code to open the application that needs to be automated goes here

        selenium.type("Name", "John");
        selenium.type("Product", "FamilyAlbum");
        selenium.type("Quantity", "5");
        selenium.type("Date of Purchase", "20/04/2008");
        selenium.type("Street", "13 park street");
        selenium.type("City", "Sydney, Australia");
        selenium.click("Submit");
    }
}

The above code will input all the mentioned values into the required fields in the web page.

So in order to submit a request for 1 customer, we have written 7 lines of code. Now suppose there are 100 customers willing to buy 100 different products. We would need to write 700 lines of code, which would make our script quite hard to maintain. Hence we can use the data-driven approach as follows:-

We can write only those mandatory 7 lines of code required to complete one transaction and make that piece of code repeat 100 times, each time taking a new customer and choosing the desired product. Let’s look at the task that needs to be performed:-

Objective: – Enter requests of 100 customers with different products as per the requirement by following the Data Driven approach.

Resources: – Selenium, Apache POI library.

Solution: – First of all, we need to set up test data in an external Excel file (TestBook.xls) as shown below. (The sample screenshot shows 10 records; we can extend that to 100.)

TestBook.xls

We can ask our Selenium script to communicate with the external file and take input data from it. There is an existing library called Apache POI, a powerful library which can perform read/write operations on Excel files.

You can download the required jar file from the location below:-

http://poi.apache.org/download.html

Below is the complete code to perform the task: –

 

package Demo;

import com.thoughtworks.selenium.*;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import java.io.FileInputStream;
import org.apache.poi.hssf.usermodel.HSSFRow;
import org.apache.poi.hssf.usermodel.HSSFSheet;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;

public class ExcelInt {

    private Selenium selenium;
    HSSFWorkbook workbook;

    @Before
    public void setUp() throws Exception {
        selenium = new DefaultSelenium("localhost", 4444, "*chrome", "<URL of the application>");
        selenium.start();
    }

    @Test
    public void testExcelInt() throws Exception {
        selenium.windowMaximize();
        selenium.open("/");

        // path of your Excel file located in the system
        FileInputStream fileInputStream = new FileInputStream("D:\\Demo1.xls");
        workbook = new HSSFWorkbook(fileInputStream);
        HSSFSheet worksheet = workbook.getSheet("Sheet1");

        // the loop below executes until it reaches the last row in Excel containing data
        for (int i = 0; i <= worksheet.getLastRowNum(); i++) {
            HSSFRow row1 = worksheet.getRow(i);
            selenium.type("Name", row1.getCell((short) 0).getStringCellValue());
            selenium.type("Product", row1.getCell((short) 1).getStringCellValue());
            selenium.type("Quantity", row1.getCell((short) 2).getStringCellValue());
            selenium.type("Date of Purchase", row1.getCell((short) 3).getStringCellValue());
            selenium.type("Street", row1.getCell((short) 4).getStringCellValue());
            selenium.type("City", row1.getCell((short) 5).getStringCellValue());
            selenium.click("Submit");
        }
    }

    @After
    public void tearDown() throws Exception {
        selenium.stop();
    }
}

The above code takes the 1st row from the data sheet and reads the values from all the columns (cells) of that row, then moves on to the 2nd row and performs the same operation. This continues until it finds a blank row in the Excel data (i.e., end of file).

Using the Apache POI library we can create a customized report as well. Just as we asked our Java code to read data from the Excel file, we can also insert data into the same Excel sheet; a minimal sketch follows.
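
A minimal sketch of that write-back, using the same HSSF classes; the result column index and the "Passed" value are illustrative:

import java.io.FileInputStream;
import java.io.FileOutputStream;
import org.apache.poi.hssf.usermodel.HSSFRow;
import org.apache.poi.hssf.usermodel.HSSFSheet;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;

public class ExcelResultWriter {
    public static void main(String[] args) throws Exception {
        // open the same workbook the test data was read from
        FileInputStream in = new FileInputStream("D:\\Demo1.xls");
        HSSFWorkbook workbook = new HSSFWorkbook(in);
        in.close();
        HSSFSheet sheet = workbook.getSheet("Sheet1");

        // write a result into column 6 (the first free column) of every data row
        for (int i = 0; i <= sheet.getLastRowNum(); i++) {
            HSSFRow row = sheet.getRow(i);
            if (row == null) continue;                  // skip empty rows
            row.createCell((short) 6).setCellValue("Passed"); // illustrative result value
        }

        // save the workbook back to disk
        FileOutputStream out = new FileOutputStream("D:\\Demo1.xls");
        workbook.write(out);
        out.close();
    }
}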
