Types of Software Testing

Dry Run Testing: Testing in which the effects of a possible failure are intentionally mitigated. It is usually done on a separate server with customer data before the actual production release.

Mutation Testing: Checks whether our unit tests are robust enough. A mutation is a small change in code: we deliberately alter a program's code and then re-run our valid unit test suite against the mutated program. A good unit test will detect the change in the program and fail accordingly.

Incremental Testing: Partial testing of an incomplete product, usually done to provide early feedback to the developers.

Bucket Testing (A/B Testing): Compares the effectiveness of two versions of a webpage or marketing email to discover which has the better response rate or sales conversion rate.

Soak Testing: Testing a system with a significant load over a significant period of time to discover how the system behaves under sustained use.

Sandbox Testing: A testing environment that isolates untested code changes and outright experimentation from the production environment, somewhat like a working directory, test server, or development server in which developers "check out" a copy of the source tree or a branch to examine and work on. Only after a developer has fully tested the code changes in their own sandbox should the changes be checked back in and merged with the repository, thereby becoming available to other developers or end users of the software.

Have questions? Contact the software testing experts at InApp to learn more.
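The mutation-testing idea described earlier can be illustrated with a toy example. The function, the mutant, and the test suites below are all invented for illustration:

```python
# Original function under test:
def is_adult(age):
    return age >= 18

# A "mutant": the same function with one small deliberate change (>= became >).
def is_adult_mutant(age):
    return age > 18

# A weak test suite that only checks a value far from the boundary:
def weak_suite(fn):
    return fn(30) is True

# A stronger suite that also checks the boundary value 18:
def strong_suite(fn):
    return fn(30) is True and fn(18) is True

# A suite "kills" the mutant if it fails when run against the mutated code.
weak_kills = not weak_suite(is_adult_mutant)      # the mutant survives
strong_kills = not strong_suite(is_adult_mutant)  # the mutant is killed
```

A surviving mutant is a hint that the test suite is missing a case, here the boundary value.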

Cross-Site Scripting (XSS)

What is Cross-Site Scripting? Cross-site scripting, also known as XSS, is a type of security vulnerability typically found in web applications. It occurs when a web application gathers malicious data from a user, usually in the form of a hyperlink that contains malicious content. Browsers are capable of displaying HTML content and executing JavaScript. If the application does not escape special characters in the input/output and sends the user input back to the browser, an attacker may be able to launch an XSS attack successfully, through which malicious files can be executed, session details of a logged-in user can be stolen, or Trojans can be installed.

Types of XSS:

Non-persistent (or reflected) XSS is the most common type. It occurs when the data provided by the attacker is immediately executed and a generated page is returned to that user.

Persistent (or stored) XSS occurs when the data provided by the attacker is saved on the server and permanently displayed on web pages returned to other users.

DOM-based XSS (type-0 XSS) is an attack wherein the attack payload is executed as a result of modifying the DOM environment in the victim's browser.

How to Perform XSS Testing:

Submitting malicious script through text inputs: List out all the text input fields (text box, text area) in the application. Submit simple JavaScript code, like '<script>alert("XSS")</script>', through each identified text input field. If the text box is vulnerable, an alert with the text mentioned in the quotes will be returned.
Submitting malicious script through an application URL: Modify the requests using security testing tools like Burp Suite to test for application vulnerability. Capture the request using the Burp tool, append the malicious script to the captured request, 'Forward' the modified URL, and validate the result.

How to Prevent XSS: XSS attacks are possible mainly because the server does not handle special characters in the output. There are two broad strategies for defeating XSS:

Whitelisting good input: Create a whitelist of characters required by the application. Once the whitelist is ready, the application should disallow all requests containing any character apart from those in the list.

Blacklisting bad input: The application should not accept any script, special character, or HTML in fields where they are not required. It should escape special characters that may prove harmful. Some of the special characters used in scripts that must be escaped are <>()[]{}/\*;:=%+^!
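Both prevention strategies can be sketched in Python. `html.escape` from the standard library handles output escaping; the whitelist pattern below is an illustrative choice for a simple name field, not a universal rule:

```python
import html
import re

def escape_output(user_input):
    # Escape HTML special characters before echoing input back to the browser.
    return html.escape(user_input, quote=True)

def is_whitelisted(user_input, pattern=r"[A-Za-z0-9 _.-]*"):
    # Whitelist check: allow only the characters the field actually needs.
    # (This pattern is a hypothetical example for a name-like field.)
    return re.fullmatch(pattern, user_input) is not None

payload = '<script>alert("XSS")</script>'
escaped = escape_output(payload)   # < > and " become &lt; &gt; &quot;
allowed = is_whitelisted(payload)  # rejected: contains < > ( ) " /
```

With escaping in place, the payload is rendered as inert text instead of being executed by the browser.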

Creating Advanced Test Plan in JMeter

The need for an Advanced Test Plan arises when the test requires any of the following:

- Validating results based on updates to a field in the database
- Using an input file to parameterize input variables
- Using While or If-Else controllers

Steps to be followed while recording an advanced script:

1. Open a new Test Plan
2. Right-click on Test Plan->Add->Threads (Users)->Thread Group
3. Right-click on Thread Group->Add->Config Element->CSV Data Set Config
4. Right-click on Thread Group->Add->Config Element->HTTP Cookie Manager
5. Right-click on Thread Group->Add->Config Element->HTTP Header Manager
6. Right-click on Thread Group->Add->Logic Controller->Transaction Controller
7. Right-click on Thread Group->Add->Logic Controller->Recording Controller
8. Right-click on WorkBench->Add->Non-Test Elements->HTTP Proxy Server
9. Click on HTTP Proxy Server, and from the Target Controller drop-down select 'Transaction Controller > Recording Controller'
10. To exclude images, add rows in the 'URL Patterns to Exclude' with the patterns ".*\.jpg", ".*\.gif", ".*\.png"
11. Right-click on Recording Controller->Add->Config Element->JDBC Connection Configuration. This is used to create a database connection; we include the Database URL, JDBC Driver Class, Username, and Password
12. Right-click on Recording Controller->Add->Config Element->User Defined Variables. This is used to replace hard-coded values such as Username. The syntax is 'Variable Name = Variable Value'; use ${variable name} in the script instead of the hard-coded value(s)
13. Right-click on Recording Controller->Add->Logic Controller->While Controller
14. Right-click on While Controller->Add->Sampler->JDBC Request. This sampler is used to send an SQL query to the database. Note: before using the JDBC Request sampler, we need to set up the 'JDBC Connection Configuration' config element
15. Right-click on While Controller->Add->Sampler->Debug Sampler. This sampler generates a sample of all values of JMeter variables. Similarly, it can generate a sample of all values of JMeter properties and system properties; presently in our scripts we set these two properties to false
16. Right-click on While Controller->Add->Timer->Constant Timer. This timer pauses for the set amount of time between requests. If we don't add a delay, JMeter could overwhelm the server by making too many requests in a short amount of time

Please note: before recording, go to Internet Explorer->Tools->Internet Options->Connections->LAN Settings, check the proxy server, set address: localhost and port: 8080, and click OK. Click Start to record the script. Once the recording is over, click Stop and save it as a JMX file.

Using Transaction Controllers in the Test Plan

Grouping test actions within a Transaction Controller makes the script much easier to understand than recording an entire script in one full stretch. To add a Transaction Controller, right-click on Thread Group->Add->Logic Controller->Transaction Controller. The Transaction Controller measures the overall time taken to perform the nested test elements. Once the entire script is executed, click on View Results Tree and select a Transaction Controller; it gives the overall time taken to process the request (load time), shown in milliseconds.

For a run with 'n' users, once the scripts are executed there will be 'n' Transaction Controllers within the View Results Tree. Each Transaction Controller, when clicked, shows three tabs: Sampler, Request, and Response. By default, the Sampler tab is shown. Assume a user logs in to an online shopping website and searches for 5 different products. While the script is under execution, the progress of the script parsing through the 5 products is reflected in a statement such as "Search Transaction Controller 1-4", meaning that of the 5 searches, the 4th has been completed.
Merging Scripts

Two or more scripts can be merged into a single Test Plan. Assume we have three scripts merged in this order:

- Create Profile (executed by 10 users)
- Basic Search (executed by another 10 users)
- Signing up on an online shopping website (executed by another 5 users)

Steps to be followed while merging the scripts:

1. Open any existing test plan, say 'CreateProfile.jmx'
2. Right-click on Test Plan->Merge
3. Select the test plan you want to merge, say 'BasicSearch.jmx'
4. A new thread group is displayed along with the existing thread group

To avoid confusion, it is better to rename the merged thread group(s). When the merged script is executed, there will be 25 (10+10+5) users' Transaction Controllers in the Test Plan. During execution, each of the 3 merged scripts is tracked independently; e.g., if the Transaction Controller shows 'Basic Search Transaction Controller 2-6', the 6th search of the 2nd merged script has just completed execution.

Input File (Comma-Delimited (CSV) File)

The recorded scripts can be executed with multiple users by parameterizing our scripts. This can be done in two ways: user-defined variables or an input file. A CSV file is very useful when executing JMeter script(s) with 'n' multiple users. The attached screenshot (CSV.jpg) has 3 columns: the first column is for the username, the second for the password, and the third for the server name. On executing the script after parameterizing, the script fetches the value from the CSV file, substitutes it into the corresponding request, and sends it to the server.

Points to be noted while using a CSV file:

1. Open an Excel file and provide the required information, say Username, Password, Server Name, etc., under each column that you would like to pass as an input parameter to the script
2. Save the Excel file as .csv
3. In the Test Plan, click on CSV Data Set Config and configure it as follows:
   Filename: filename.csv
   File encoding: leave it blank
   Variable Names (comma-delimited): specify the names for the parameters (values) in each column of the CSV file; later use these variable names in the script as '${variablename}'
   Delimiter (use \t for tab): ,
   Allow quoted data?: False
   Recycle on EOF?: True
   Stop thread on EOF?: False
   Sharing mode: All threads

Note: The first variable name entered in the Variable Names text box receives the value of the first column of the CSV file, and so on. Replace values with the corresponding variable names in the form '${variable name}' throughout the script. It is also better to give the variables meaningful names.
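The way CSV Data Set Config binds columns to variables can be sketched in plain Python. The file contents and variable names here are hypothetical stand-ins for the three columns described above:

```python
import csv
import io

# Hypothetical CSV contents: one row per virtual user, three columns.
csv_text = "user1,pass1,server1\nuser2,pass2,server2\n"

# Variable names as they would be typed into 'Variable Names (comma-delimited)':
variable_names = ["Username", "Password", "Servername"]

# Each thread iteration consumes one row; column i is bound to variable i,
# which the script then references as ${Username}, ${Password}, ${Servername}.
rows = [dict(zip(variable_names, record))
        for record in csv.reader(io.StringIO(csv_text))]
```

This mirrors the note above: the first variable name receives the first column, the second name the second column, and so on.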

Creating Basic Test Plan in JMeter

How to Create a Basic Test Plan? Steps to be followed while recording a script:

1. Open a new Test Plan
2. Right-click on Test Plan->Add->Threads (Users)->Thread Group
3. Right-click on Thread Group->Add->Config Element->HTTP Cookie Manager
4. Right-click on Thread Group->Add->Config Element->HTTP Header Manager
5. Right-click on Thread Group->Add->Config Element->HTTP Request Defaults
6. Right-click on Thread Group->Add->Logic Controller->Recording Controller
7. Right-click on WorkBench->Add->Non-Test Elements->HTTP Proxy Server*
8. Click on HTTP Proxy Server, and from the Target Controller drop-down select 'Thread Group > Recording Controller'**
9. Click on Start to record the script
10. Once the recording is over, click Stop and save as a ".jmx" file

*Before recording, go to Internet Explorer->Tools->Internet Options->Connections->LAN Settings, check the proxy server, set address: localhost and port: 8080, and click OK.

**To exclude images, add rows in the 'URL Patterns to Exclude' with the patterns ".*\.jpg", ".*\.gif", ".*\.png"

To set the number of users: click on Thread Group, set the Number of Threads (users), and set the Ramp-Up Period (in seconds), which controls how quickly the threads are started.

To add test results (Listeners) to the Test Plan: right-click on Thread Group->Add->Listener->View Results Tree. This report gives details of the Sampler (HTML page), request info, and response info.

Test Automation Frameworks

A framework is a set of assumptions, concepts, and practices that support automation.

Types of Frameworks:

- Test Script Modularity Framework
- Test Library Architecture Framework
- Keyword-Driven Framework
- Data-Driven Framework
- Hybrid Framework

Test Script Modularity Framework: Requires the creation of small, independent scripts that represent the modules, sections, and functions of the AUT (application under test).

Test Library Architecture Framework: Divides the AUT into procedures and functions, creating library files that represent the modules, functions, and sections of the AUT.

Keyword-Driven Framework: Requires the development of data tables and keywords, independent of the test automation tool used to execute them and of the test script code that "drives" the application under test and the data. Keyword-driven tests look very similar to manual test cases: the functionality of the application under test is documented in a table, as step-by-step instructions for each test.

Data-Driven Framework: Input and output values are read from data files.

Hybrid Framework: Combines features of two or more of the above approaches.

Have questions? Contact the software testing experts at InApp to learn more.
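The keyword-driven idea can be sketched as a toy runner: test steps live in a data table, and the runner dispatches each keyword to an action. Every keyword, action, and step below is invented for illustration:

```python
# Hypothetical "actions" the keywords map to; a real framework would drive
# a browser or application here instead of a plain dictionary.
def open_page(state, url):
    state["page"] = url

def type_text(state, field, value):
    state[field] = value

KEYWORDS = {"open": open_page, "type": type_text}

def run(table):
    # Execute the data table row by row, dispatching on the keyword column.
    state = {}
    for keyword, *args in table:
        KEYWORDS[keyword](state, *args)
    return state

# The table reads much like a manual test case, as described above:
result = run([
    ("open", "https://example.com/login"),
    ("type", "username", "alice"),
])
```

The point of the design is that new tests are added by editing the table, not the runner code.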

Basics Of Messaging Platform

There are different types of messages that can be sent using a messaging platform, including:

- Text messages
- Multimedia messages
- WAP messages
- Service messages

Here we take a deeper look at text messages. There are basically 3 types:

- UTF-16 encoded (16-bit Unicode Transformation Format)
- UTF-8 encoded (8-bit Unicode Transformation Format)
- Flash

UTF-16: Normal English characters come under this category, and special characters like semicolons, full stops, etc. are supported. Messages have a length of 160 characters; if a message goes beyond the 160-character limit, it is split in two at the 154th character and then concatenated on the mobile device. These are class 1 messages.

UTF-8: The purpose of this encoding is to support international characters; languages like French, Spanish, Arabic, Hindi, Malayalam, etc. are supported. For UTF-8 encoded messages the length limit is 60 characters. Concatenation and slicing take place if the character count goes beyond that limit.

Flash messages: These are normal messages with a length of 160 characters in English and 60 characters in other languages like French and Spanish; the only difference is that these messages are not saved to the phone memory. The class is set to 0 for generating flash messages.

The figure displays the transaction process that takes place between the SMPP client and the SMSC. The SMPP client sends a bind request to the SMSC, and the SMSC responds to the request. If the bind was successful, the SMPP client sends a Submit SM to the SMSC and receives a successful response if the submission was good. The SMSC then identifies the originator, destination, sender, text message, etc., and forwards the message to the destination address. On receiving a successful delivery report from the mobile device, the SMSC forwards it to the SMPP client, and the SMPP client sends a response.
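The splitting-and-concatenation rule described above can be sketched as follows, using the figures from the text: a 160-character single-message limit, with long messages sliced into concatenated parts of 153 characters (splitting "at the 154th character"). Unicode messages would use the smaller limits analogously:

```python
def split_message(text, limit=160, part_size=153):
    # A message within the single-message limit is sent as-is.
    if len(text) <= limit:
        return [text]
    # A longer message is sliced into parts that the handset re-concatenates.
    return [text[i:i + part_size] for i in range(0, len(text), part_size)]

parts = split_message("A" * 200)  # split in two: 153 + 47 characters
```

The exact part size in practice depends on the encoding and the space reserved for the concatenation header; the numbers here follow the article.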
After the delivery response has been sent to the SMSC, the SMPP client sends an unbind request to the SMSC; on successful reception of the unbind request, the SMSC unbinds by sending an unbind response.

The above diagram explains the general internal architecture of a messaging application and its routing system. The system consists of:

- A messaging application at the user end for pushing bulk messages
- Input queues, operator queues, response queues, etc.
- A database to store the messages, responses, and delivery details
- Operators to which messages are pushed
- Finally, the mobile devices to which the messages are delivered

The messaging application at the end-user side pushes bulk messages into the input queues. From the input queues, the messages are pushed in parallel to the database and the routing application. The routing application is responsible for identifying the exact route for each message; once the route is identified, it is updated in the database. As per the identified route, the message falls into an operator queue. From the operator queue, messages are pushed to the operators, and when an operator successfully receives a message, it pushes a response to the response queue. After pushing the response, the operator sends the message to the mobile device. The mobile device acknowledges reception of the message to the operator, the operator pushes this delivery response to the delivery response queue, and all statuses are updated in the database.

Everything about Performance Testing

What is Performance Testing? Performance testing of an application is basically the process of understanding how the web application and its operating environment respond at various user load levels. In general, we want to measure the latency, throughput, and utilization of the website while simulating attempts by virtual users to simultaneously access the site. One of the main objectives of performance testing is to maintain a website with low latency, high throughput, and low utilization.

A performance test measures how well the application meets customer expectations in terms of:

- Speed: determines if the application responds quickly
- Scalability: determines how much user load the application can handle
- Stability: determines if the application is stable under varying loads

Why Performance Testing? Performance problems are usually the result of contention for, or exhaustion of, some system resource. When a system resource is exhausted, the system is unable to scale to higher levels of performance. Maintaining optimum web application performance is a top priority for application developers and administrators. Performance analysis is also carried out for various purposes:

- During a design or redesign of a module or part of the system, more than one alternative may present itself. In such cases, evaluating the design alternatives is the prime mover for an analysis.
- Post-deployment realities create a need for tuning the existing system. A systematic approach like performance analysis is essential to extract maximum benefit from an existing system.
- Identification of bottlenecks in a system is more of an effort at troubleshooting, helping to replace components and focus efforts on improving overall system response.
- As the user base grows, the cost of failure becomes increasingly unbearable. To increase confidence and provide advance warning of potential problems under load conditions, analysis must be done to forecast performance under load.
Typically, to debug applications, developers execute them using different execution streams (i.e., completely exercising the application) in an attempt to find errors. When looking for errors in the application, performance is a secondary issue to features; however, it is still an issue.

Objectives of Performance Testing:

- End-to-end transaction response time measurements
- Measure the application server components' performance under various loads
- Measure database components' performance under various loads
- Monitor system resources under various loads
- Measure the network delay between the server and the clients

Performance Testing Approach:

1. Identify the Test Environment. Identify the physical test environment and the production environment, as well as the tools and resources available to the test team. The physical environment includes hardware, software, and network configurations. Having a thorough understanding of the entire test environment at the outset enables more efficient test design and planning and helps you identify testing challenges early in the project. In some situations, this process must be revisited periodically throughout the project's life cycle.

2. Identify Performance Acceptance Criteria. Identify the response time, throughput, and resource utilization goals and constraints. In general, response time is a user concern, throughput is a business concern, and resource utilization is a system concern. Additionally, identify project success criteria that may not be captured by those goals and constraints; for example, using performance tests to evaluate which combination of configuration settings will result in the most desirable performance characteristics.

3. Plan and Design Tests. Identify key scenarios, determine variability among representative users and how to simulate that variability, define test data, and establish the metrics to be collected. Consolidate this information into one or more models of system usage to be implemented, executed, and analyzed.

4. Configure the Test Environment. Prepare the test environment, tools, and resources necessary to execute each strategy as features and components become available for test. Ensure that the test environment is instrumented for resource monitoring as necessary.

5. Implement the Test Design. Develop the performance tests in accordance with the test design.

6. Execute the Test. Run and monitor your tests. Validate the tests, test data, and results collection. Execute validated tests for analysis while monitoring the test and the test environment.

7. Analyze Results, Report, and Retest. Consolidate and share results data. Analyze the data both individually and as a cross-functional team. Reprioritize the remaining tests and re-execute them as needed. When all of the metric values are within accepted limits, none of the set thresholds have been violated, and all of the desired information has been collected, you have finished testing that particular scenario on that particular configuration.

Functions of a Typical Tool:

- Record & Replay: Record the application workflow and play back the script to verify the recording.
- Execute: Run the fully developed test script for a stipulated number of virtual users to generate load on the AUT (application under test). The dashboard displays the values for the desired parameters, and a remote connection to the app/web servers (Linux/Windows) gathers resource utilization data.
- Analyze: Generates the report; helps to analyze the results and troubleshoot issues.

Attributes Considered for Performance Testing:

The following are only a few of the many attributes considered during performance testing:

- Throughput
- Response time
- Time (session time, reboot time, printing time, transaction time, task execution time)
- Hits per second, requests per second, transactions per second
- Performance measurement with a number of users
- Performance measurement with other interacting applications or tasks
- CPU usage
- Memory usage (memory leaks, thread leaks)
- All queues and I/O waits
- Bottlenecks (memory, cache, process, processor, disk, and network)
- Highly iterative loops in the code
- Data not optimally aligned in memory
- Poor structuring of joins in SQL queries
- Too many static variables
- Indexes on the wrong columns; inappropriate combination of columns in composite indexes
- Network usage (bytes, packets, segments, frames received and sent per second, bytes total/sec, current bandwidth, connection failures, connections active, failures at the network interface level and protocol level)
- Database problems (settings and configuration, usage, reads/sec, writes/sec, locking, queries, compilation errors)
- Web server (requests and responses per second, services succeeded and failed, server problems if any)
- Screen transitions
- Throughput and response time with different user loads
- CPU and memory usage with different user loads
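Several of the attributes above, throughput, average response time, and percentile response time, can be computed directly from raw timing samples. A minimal sketch with made-up numbers:

```python
# Hypothetical response-time samples (milliseconds) from one test run,
# and the assumed wall-clock duration of that run.
samples_ms = [120, 95, 180, 110, 300, 105, 90, 250, 130, 100]
duration_s = 5.0

# Throughput: completed requests per second of test time.
throughput = len(samples_ms) / duration_s

# Average response time: mean latency across all samples.
avg_response = sum(samples_ms) / len(samples_ms)

# 90th percentile: the latency that 90% of requests stayed at or under
# (nearest-rank method; tools may use slightly different interpolation).
ordered = sorted(samples_ms)
p90 = ordered[int(0.9 * len(ordered)) - 1]
```

Percentiles are usually more informative than the average here, since a few slow outliers (like the 300 ms sample) can hide behind a healthy-looking mean.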

Working with Regular Expression Extractor in JMeter

Using the Regular Expression Extractor in JMeter: When automating tests, test scripts often depend on input values that are generated during the test run. These values can be stored in a variable, but sometimes the test requires only a part of the value. In such cases, the need for a string extractor is felt. The Regular Expression Extractor serves this purpose by pulling out the required values that match a pattern.

Some common regular expression constructs:

- [ ] matches anything within the square brackets
- A dash inside square brackets specifies a range, e.g., [0-9] means all digits from 0 to 9
- ^ negates the expression, e.g., [^a-z] means everything except lowercase a to z
- $ checks for a match at the end of the target string

While scripting with JMeter, a Regular Expression Extractor is used to retrieve values from the server response. The extracted value can be passed as a parameter to the While and If controllers, and can also be used to replace any pre-defined variable. The regular expressions used are Perl-style regular expressions.

To add a Regular Expression Extractor element to a test plan in JMeter:

1. Right-click the sampler element (the request to the server from which the value needs to be extracted)
2. Select Add->Post Processors->Regular Expression Extractor

The Regular Expression Extractor element is explained in detail at http://jmeter.apache.org/usermanual/component_reference.html#Regular_Expression_Extractor

Extracting a single string from the response: Consider an example where a user successfully logs in to an online shopping website and is navigated to the user's home page, where 'Welcome Username' is displayed. To extract the username, the following settings can be used:

Reference Name: Username
Regular Expression: Welcome (.+?)
Template: $1$
Match No. (0 for Random): 1
Default Value: match not found

Note the meaning of the special characters above: ( ) encloses the portion of the matched string to be returned; . matches any character; + means one or more times; ? stops when the first match is found. Without the ?, the .+ would continue until it found the last possible match.

Extracting multiple strings from the response: Consider a scenario where the user selects an item that has a product ID and a category ID. To extract both IDs, the following settings can be used:

Reference Name: My_ID
Regular Expression: Product_ID = (.+?)\&Category_ID = (.+?)
Template: $1$$2$
Match No. (0 for Random): 1
Default Value: match not found

Since we need to extract two values from the response, two groups are created, so the template has $1$$2$. The JMeter regex extractor saves the values of the groups in additional variables. The following variables would be set:

My_ID -> PR_001CAT_001
My_ID_g0 -> Product_ID="PR_001"&Category_ID="CAT_001"
My_ID_g1 -> PR_001
My_ID_g2 -> CAT_001

These variables can later be referred to in the JMeter test plan as ${My_ID_g1} and ${My_ID_g2}.

Extracting only numbers from a string: Consider a case where we need to extract only the numbers, for example from the product ID 'PR_001'. To extract 001, the following settings can be used:

Reference Name: ProductID
Regular Expression: Product_ID = "PR_(.+?)"
Template: $1$
Match No. (0 for Random): 1
Default Value: match not found

Have questions? Contact the software testing experts at InApp to learn more.
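The same extractions can be reproduced with Python's re module, which uses compatible Perl-style syntax. The sample response text is invented; note that a trailing delimiter (here '!') is added after the lazy (.+?) so the pattern has somewhere to stop:

```python
import re

# A hypothetical server response containing all three values of interest.
response = 'Welcome JohnDoe! ... Product_ID = "PR_001"&Category_ID = "CAT_001"'

# Single value, like Template $1$:
username = re.search(r"Welcome (.+?)!", response).group(1)

# Two groups, like Template $1$$2$ (pattern adapted to the quoted response):
m = re.search(r'Product_ID = "(.+?)"&Category_ID = "(.+?)"', response)
product_id, category_id = m.group(1), m.group(2)

# Numbers only: anchor the literal prefix and capture what follows it.
number = re.search(r'Product_ID = "PR_(.+?)"', response).group(1)
```

This is a convenient way to debug an extractor pattern locally before pasting it into the JMeter element.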

Automation Index Formula – A checklist to help in identifying the tests that are feasible to automate

"Just because a test is automatable does not mean it should be automated." – Elfriede Dustin

Automation testing begins with an analysis of what is feasible to automate, taking into account the budget, resources, schedule, and available expertise. Given limited resources and tight deadlines, we first need to prioritize what is to be automated. The effort required can be measured with the help of the Automation Index.

Automation Index Formula: The Automation Index is the ratio of the number of test cases that are feasible to automate to the total number of test cases.

AI = TFA / TC

where
AI = Automation Index
TFA = Tests feasible to be automated
TC = Total number of test cases

A checklist helps in identifying the tests that are feasible to automate: tests that score yes on the checklist are good candidates for automation. In addition to the checklist, factors such as budget, resources, schedule, and available expertise should also be considered.

Have questions? Contact the software testing experts at InApp to learn more.
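The formula is a one-liner in code; a minimal sketch:

```python
def automation_index(tests_feasible_to_automate, total_test_cases):
    # AI = TFA / TC, expressed as a ratio between 0 and 1.
    if total_test_cases <= 0:
        raise ValueError("total_test_cases must be positive")
    return tests_feasible_to_automate / total_test_cases

# e.g., 120 of 200 test cases pass the feasibility checklist:
ai = automation_index(120, 200)  # 0.6, i.e. 60% of the suite
```

A higher index suggests the suite is a stronger candidate for an automation effort; a low one means most cases should stay manual.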

Mobile Application Testing

Introduction: Handheld devices are evolving and becoming increasingly complex with the continuous addition of features and functionalities. Testing is challenging in the handheld, wireless world because problems are new, or they show up in new ways. This paper aims to highlight certain crucial areas the tester needs to concentrate on while testing mobile applications.

The 4 main areas to consider:

1. Understanding the behavior of the device
2. UI & usability testing
3. External constraints
4. Stress testing

Understanding the Behavior of the Device: If you are new to a device, the first thing you should do is get familiar with how the common device functionalities work, such as the phone, camera, contacts, calendar, programs, etc. Things to note while exploring built-in applications:

- Overall color scheme/theme of the device
- Style and color of icons
- Progress indicators when pages are loading
- Menus: how they are invoked and the typical items they contain
- Overall responsiveness of applications on the device

UI & Usability Testing: The unique features of mobile devices pose a number of significant challenges for examining the usability of mobile applications, including screen orientation, multi-modality, small screen size, different display resolutions, soft keyboards, and touch screens.

Screen resolution: If your application is supported on various devices with different screen resolutions, make sure you test with the device that has the smallest screen, and that the application still looks good on larger screen sizes as well.

Screen orientation (landscape/portrait modes): If your device supports screen orientation changes, be sure to include lots of testing where you rotate the device from portrait to landscape display, and vice versa, on all of the pages within your application. It is also important to test input reactions when the screen orientation is changed. Try using the soft keyboard while changing the orientation repeatedly. Attempt this repeatedly and quickly to see if rapid changes in orientation have a negative effect on the application.

Touch screens: Make sure that the application supports multi-touch gestures (e.g., pinch, two-finger tap, two-finger scroll, spread, two-hand spread) and single-touch gestures (e.g., tap, double tap, scroll) based on the requirements. The application should also be tested for long-touch and soft-touch behavior.

Soft keyboards, points to consider:

- Does the soft keyboard appear automatically?
- Does the first layer of the soft keyboard include shortcuts related to highlights?
- Does a long touch on a soft character key bring up several different character choices?
- Can the soft keyboard be dismissed and re-displayed easily?
- Can the soft and hard keyboards be used interchangeably (if the device has both)?
- Do soft keyboard characters entered in password fields show up only as ****?

Multi-modality: Multi-modality combines voice and touch (via a keypad or stylus) as input with relevant spoken output (e.g., users are able to hear synthesized, prerecorded, streaming, or live instructions, sounds, and music on their mobile devices) and onscreen visual displays, in order to enhance the mobile user experience and expand network operator service offerings. Make sure that the application supports this functionality based on the requirements.

External Factors Affecting Mobile Application Testing

Network connections: Since the app is going to be used on devices in various locations with various network connection speeds, it is important to plan testing coverage for the following scenarios:

- Only a Wi-Fi connection
- Only a 3G/2G connection
- No SIM card in the device
- Airplane mode (or all connections disabled)
- Using the network through a USB connection to a PC

Also test intermittent network scenarios that a user might encounter in the real world.

Phone calls: The tester has to check the application behavior during incoming and outgoing calls. Make sure that the application works fine when:

- The application is interrupted by an incoming call and the originator hangs up the call
- The application is interrupted by an incoming call and the terminator hangs up the call
- The application is interrupted by placing an outgoing call and the originator hangs up the call
- The application is interrupted by placing an outgoing call and the terminator hangs up the call

Other interruptions: The tester has to consider the interrupts below, which could have an impact on the functionality or overall responsiveness of the application:

- Text messages
- Voicemail notifications
- Calendar events
- Social media notifications (Facebook, Twitter, etc.)
- Alarm clocks
- Low battery notifications

Device settings: Explore your device's options and change settings such as the following to see how they affect your application:

- Sound profiles: does your application respect the device's sound settings?
- Device password/unlock pattern: does your application still install correctly when prompted for a password/unlock pattern?
- Font: how does choosing a different font family, size, or style affect the appearance and usability of your application?
- Screen timeout/auto on-off: is your application subject to screen dimming or automatically turning off even when it is actually busy?
- Screen orientation: does your application respect this setting?
- Connections: how does enabling/disabling Bluetooth or other connection types affect your application's behavior?

Stress Testing: Certain mobile applications consume more memory and CPU than desktop applications. Stress testing is a must to identify exceptions, application hangs, and deadlocks that may go unnoticed during functional and user interface testing. Note the behavior of the application while testing the following scenarios:

- Load your application with as much data as possible in an attempt to reach its breaking point
- Perform the same operations over and over again, particularly those that load large amounts of data repeatedly
- Perform the repeated operations at varying speeds, very quickly or very slowly
- Leave your application running for a long period of time, both interacting with the device and just letting it sit idle, or performing some automatic task that takes a long time
- Run multiple applications on your device so you can switch between your application and the others
- After testing several functionalities, switch the device off and on
