JQuery – Use The “On()” Method Instead Of “Live()”


As of jQuery 1.7, the .live() method is deprecated; use .on() to attach event handlers instead. The .on() method attaches an event handler function, for one or more events, to the currently selected set of elements in the jQuery object.

Syntax:

.on(events [, selector] [, data], handler(eventObject))

Examples:

$("body").on("click", "#element", function() {
    $("#my").html(result);
});

$("body").on("click", "p", function() {
    alert($(this).text());
});

Migration from .live() to .on(). Before:

$('#mainmenu a').live('click', handler);

After, you move the child element (a) into the .on() selector argument:

$('#mainmenu').on('click', 'a', handler);

Have questions? Contact the technology experts at InApp to learn more.

What is Apache Airavata? Its Features and Architecture


What is the meaning of Airavata? Airavata is a mythological white elephant that carries the Hindu god Indra. It is also called 'abhra-Matanga', meaning "Elephant of the Clouds" (http://en.wikipedia.org/wiki/Airavata).

What is Apache Airavata?

Apache Airavata is a software framework for executing and managing computational jobs and workflows on distributed computing resources, including local clusters, supercomputers, national grids, and academic and commercial clouds. Airavata can compose, manage, execute, and monitor a variety of distributed applications and workflows that run on computational resources. Concepts of service-oriented computing, distributed messaging, and workflow composition and orchestration provide the foundation for Airavata. You can visit the official website of Apache Airavata at https://airavata.apache.org/

What are the Features of Apache Airavata?

- Desktop tools and browser-based web interface components for managing applications, workflows, and generated data.
- Sophisticated server-side tools for registering and managing scientific applications on computational resources.
- Graphical user interfaces to construct, execute, control, manage, and reuse scientific workflows.
- Interfacing and interoperability with various external (third-party) data, workflow, and provenance management tools.

Airavata Architecture

The architecture is designed as a modular, componentized software framework, as illustrated in the following figure. The goal of the Airavata framework is a minimalist architectural design (i.e., a thin layer) that is conceptually simple to understand and easy to install, maintain, and use. Have questions? Contact the technology experts at InApp to learn more.

Testing Web Services using ApacheBench


ApacheBench (ab) is a tool for benchmarking an Apache Hypertext Transfer Protocol (HTTP) server. It shows how many requests per second the server is capable of handling. A point to note is that ApacheBench uses only one operating system thread regardless of the concurrency level specified by the -c parameter. In some cases, especially when benchmarking high-capacity servers, a single instance of ApacheBench can itself be a bottleneck; to overcome this, additional instances of ApacheBench may be run in parallel to more fully saturate the target URL. ApacheBench was recently used to test the capability of the Caleum server, to find the threshold of the total number of web requests it can serve concurrently in its current configuration.

Working with ApacheBench

Installing on a Windows machine

1. Download the software from http://www.apache.org/dist/httpd/binaries/win32/ by selecting any of the mirrors on the site. Select the latest version of the software.
2. Double-click and install the software. While installing, provide the following information:
   - Network Domain: localhost
   - Server Name: localhost
   - Admin Email: provide a real or fake email
   - Leave all default checkboxes checked
3. After installation, an icon will be displayed in the system tray. This means Apache 2.2 has been installed and started. To verify further, type http://localhost/ in the browser; if Apache 2.2 has started, the message "It works!" will be loaded in the browser in bold. To stop or restart the server, click the tray icon -> Apache 2.2 -> Stop/Restart.

To measure the performance of a server you may need to point your files to Apache. Since we are doing web service testing, this step is optional.

Execution: Open the command prompt, go to the path where ApacheBench is installed, say "C:\Program Files\Apache Software Foundation\Apache2.2\bin", and type:

ab -n 100 -c 10 http://{webserver hostname:port}/{document path}

You can also provide the authentication details as parameters in the document path.
Other options that can be used are:

-n requests      Number of requests to perform
-t timelimit     Seconds to max. wait for responses
-v verbosity     How much troubleshooting info to print
-b windowsize    Size of TCP send/receive buffer, in bytes
-C attribute     Add cookie, e.g. 'Apache=1234'. (repeatable)
-H attribute     Add arbitrary header line, e.g. 'Accept-Encoding: gzip'.
                 Inserted after all normal header lines. (repeatable)
-A attribute     Add Basic WWW Authentication; the attributes are a colon-separated username and password.
-P attribute     Add Basic Proxy Authentication; the attributes are a colon-separated username and password.
-x attributes    String to insert as table attributes
-y attributes    String to insert as tr attributes
-z attributes    String to insert as td or th attributes
-Z ciphersuite   Specify SSL/TLS cipher suite (see openssl ciphers)
-c concurrency   Number of multiple requests to make
-T content-type  Content-type header for POSTing, e.g. 'application/x-www-form-urlencoded'. The default is 'text/plain'.
-g filename      Output collected data to a gnuplot format file
-e filename      Output CSV file with percentages served
-p postfile      File containing data to POST. Remember also to set -T
-f protocol      Specify SSL/TLS protocol (SSL2, SSL3, TLS1, or ALL)
-X proxy:port    Proxy server and port number to use
-i               Use HEAD instead of GET
-V               Print the version number and exit
-k               Use the HTTP KeepAlive feature
-d               Do not show the percentiles served table
-S               Do not show confidence estimators and warnings
-r               Don't exit on socket receive errors
-h               Display usage information (this message)
-w               Print out results in HTML tables

Output like the following is displayed in the command prompt after the execution:

Concurrency Level:      10
Time taken for tests:   321.212 sec
Complete requests:      1000
Failed requests:        11
   (Connect: 0, Receive: 0, Length: 11, Exceptions: 0)
Write errors:           0
Document Length:        21 bytes
Total transferred:      22124 bytes
HTML transferred:       11994 bytes
Requests per second:    1.01 [#/sec] (mean)
Time per request:       1216.319 [ms] (mean)
Time per request:       156.272 [ms] (mean, across all concurrent requests)
Transfer rate:          1.81 [Kbytes/sec] received
                        1.61 kb/s sent
                        0.42 kb/s total

Connection Times (ms)
              min  mean  [+/-sd]  median   max
Connect:      200   200   121       212   3000
Processing:   301  2121   612.8    1921   3267
Waiting:      211  2112    21       121   1211
Total:        711  3546   799.3    3281   6547

Percentage of the requests served within a certain time (ms)
  50%   1212
  66%   3823
  75%   2211
  80%   4555
  90%   5555
  95%   6666
  98%   7777
  99%   8888
 100%   8899 (longest request)

The output shows the total time to complete the entire test and the numbers of completed and failed requests. If there are any failures, an additional line breaks them down by Connect:, Receive:, Length:, and Exceptions:. While testing the web server, we mainly focus on the failures in Connect and Receive. Failures in Length are due to the content length not being specified, or some additional data (such as ads) coming up on the page that goes beyond the specified length. Have questions? Contact the software testing experts at InApp to learn more.

WebSocket SaaS | WebSocket Protocol Handshake


There are several techniques, such as push and comet, by which the server sends data to the client at the very moment it knows that new data is available. These all create the illusion of a server-initiated connection, called long polling. With long polling, the client opens an HTTP connection to the server and keeps it open until it receives a response. These techniques work quite well; we use them daily in applications such as Gtalk. However, all of these workarounds share one problem: they carry the overhead of HTTP and are not suited to low-latency applications.

WebSocket: Sockets into the web

WebSocket defines an API for establishing a socket connection between a browser and a server. Simply put, it is a persistent connection between the client and the server, and both parties can send data at any time. It is an independent TCP-based protocol. Its only relationship to HTTP is that the handshake is handled by HTTP servers as an Upgrade request.

WebSocket protocol handshake

To establish a WebSocket connection, the client sends a handshake request and the server sends back the handshake response, as shown below.

Client request:

GET /chat HTTP/1.1
Host: inapp.com
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: x3JJHMbDL1EzLkh9GBhXDw==
Sec-WebSocket-Protocol: chat
Sec-WebSocket-Version: 13
Origin: http://old.inapp.com

Server response:

HTTP/1.1 101 Switching Protocols
Upgrade: WebSocket
Connection: Upgrade
Sec-WebSocket-Accept: HSmrc0sMlYUkAGmm5OPpG2HaGWk=
Sec-WebSocket-Protocol: chat

Getting started

Create a WebSocket connection using JavaScript (supported only in modern HTML5-enabled browsers):

var connection = new WebSocket('ws://localhost:8080/test');

You might have noticed ws://, which is the URL scheme for a WebSocket connection. There is also wss:// for a secure WebSocket connection.
// When the connection is open, send some data to the server
connection.onopen = function () {
    connection.send('Ping'); // Send the message 'Ping' to the server
};

// Log errors
connection.onerror = function (error) {
    console.log('WebSocket Error ' + error);
};

// Log messages from the server
connection.onmessage = function (e) {
    console.log('Server: ' + e.data);
};

The Server-Side

With the release of Windows Server 2012 and Windows 8, Internet Information Services (IIS) 8.0 has added support for the WebSocket protocol. http://www.iis.net/learn/get-started/whats-new-in-iis-8/iis-80-websocket-protocol-support

The apache-websocket module is an Apache 2.x server module that may be used to process requests using the WebSocket protocol (RFC 6455) by an Apache 2.x server. https://github.com/disconnect/apache-websocket

Tomcat 7 also provides an API for WebSocket, but it has not yet been finalized. http://tomcat.apache.org/tomcat-7.0-doc/web-socket-howto.html

Have questions? Contact the technology experts at InApp to learn more.

Mobile App Installation Checklist


Mobile App Installation Checklist:

- Ensure the test device is not the same as the one used for development and is not set up as a development environment.
- Verify that the application can be installed successfully following normal installation procedures.
- Verify that the version number matches the version specified during submission.
- Verify that the application is seen in the installed applications list.
- Verify whether a proper alert is displayed when the normal installation procedures are not followed.
- Check installation with low Wi-Fi connectivity.
- Test installation behavior with Wi-Fi in disconnected mode.
- Check uninstallation and reinstallation.
- Check application start/stop behavior: start the application by selecting the icon or following the steps outlined in the submission statement.
- Check installation behavior when receiving voice calls: while the installation process is in progress, make a call to the test device.
- Check installation behavior when receiving text messages: while the installation process is in progress, send a text message to the test device.
- Check whether the app is supported on older firmware (e.g., iOS 3.1.3), especially if it is part of the requirement; otherwise, an intelligent message should be displayed to the user.

How to edit more than 200 rows in SQL Server Management Studio 2008


There are two options to edit more than 200 rows in SQL Server Management Studio 2008.

Option 1 - change the 200-row setting permanently:

Go to Tools -> Options -> SQL Server Object Explorer -> Commands and edit the "Value for Edit Top <n> Rows Command" setting.

Option 2 - change the 200-row setting temporarily:

Right-click the table and click Edit Top 200 Rows; a new query window will be opened. You can change the SELECT TOP (n) statement there. After changing it, click the red exclamation mark (!) to update the selection.

Disable "Prevent saving changes that require the table to be re-created"

By default, Microsoft SQL Server Management Studio has the option "Prevent saving changes that require the table to be re-created" enabled. This causes an error when you want to save changes to a table that require the table to be dropped and re-created. To turn off the setting, go to Tools -> Options -> Designers -> Table and Database Designers on the menu bar, then under the Table Options section, uncheck "Prevent saving changes that require the table to be re-created". Have questions? Contact the technology experts at InApp to learn more.

Insert Generator Script


By executing the following stored procedure, we can create INSERT statements for all records in a table: EXECUTE [InsertGenerator] 'tableName'

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROC [dbo].[InsertGenerator]
(@tableName varchar(max)) as
-- Declare a cursor to retrieve column-specific information for the specified table
DECLARE cursCol CURSOR FAST_FORWARD FOR
SELECT column_name, data_type FROM information_schema.columns WHERE table_name = @tableName
OPEN cursCol
DECLARE @string nvarchar(max)     -- for storing the first half of the INSERT statement
DECLARE @stringData nvarchar(max) -- for storing the data (VALUES) related statement
DECLARE @dataType nvarchar(max)   -- data types returned for respective columns
SET @string='INSERT '+@tableName+'('
SET @stringData=''
DECLARE @colName nvarchar(max)
FETCH NEXT FROM cursCol INTO @colName,@dataType
IF @@fetch_status<>0
BEGIN
    print 'Table '+@tableName+' not found, processing skipped.'
    CLOSE cursCol
    DEALLOCATE cursCol
    RETURN
END
WHILE @@FETCH_STATUS=0
BEGIN
    IF @dataType in ('varchar','char','nchar','nvarchar')
    BEGIN
        --SET @stringData=@stringData+'''''''''+isnull('+@colName+','''')+'''''',''+'
        SET @stringData=@stringData+''''+'''+isnull('''''+'''''+['+@colName+']+'''''+''''',''NULL'')+'',''+'
    END
    ELSE IF @dataType in ('text','ntext') -- if the datatype is text or something else
    BEGIN
        SET @stringData=@stringData+'''''''''+isnull(cast(['+@colName+'] as varchar(2000)),'''')+'''''',''+'
    END
    ELSE IF @dataType = 'money' -- because money doesn't get converted from varchar implicitly
    BEGIN
        SET @stringData=@stringData+'''convert(money,''''''+isnull(cast(['+@colName+'] as varchar(200)),''0.0000'')+''''''),''+'
    END
    ELSE IF @dataType='datetime'
    BEGIN
        SET @stringData=@stringData+'''convert(datetime,'+'''+isnull('''''+'''''+convert(varchar(200),['+@colName+'],121)+'''''+''''',''NULL'')+'',121),''+'
    END
    ELSE IF @dataType='image'
    BEGIN
        SET @stringData=@stringData+'''''''''+isnull(cast(convert(varbinary,['+@colName+']) as varchar(6)),''0'')+'''''',''+'
    END
    ELSE -- presuming the data type is int, bit, numeric, or decimal
    BEGIN
        SET @stringData=@stringData+''''+'''+isnull('''''+'''''+convert(varchar(200),['+@colName+'])+'''''+''''',''NULL'')+'',''+'
    END
    SET @string=@string+'['+@colName+'],'
    FETCH NEXT FROM cursCol INTO @colName,@dataType
END
DECLARE @query nvarchar(MAX)
SET @query ='SELECT '''+substring(@string,0,len(@string)) + ') VALUES(''+ ' + substring(@stringData,0,len(@stringData)-2)+'''+'')'' FROM '+@tableName
exec sp_executesql @query
CLOSE cursCol
DEALLOCATE cursCol

Have questions? Contact the technology experts at InApp to learn more.

How to Clear SQL Server Transaction Log


In some cases, the Microsoft SQL Server transaction log (.LDF) file becomes very large. It wastes a lot of disk space and causes problems when you want to back up and restore the database. We can delete the log file, and SQL Server will create a new log file with the minimum size. To delete SQL Server transaction log files, follow the steps given below:

1. Back up the database.
2. Detach the database: right-click the database => Tasks => Detach.
3. Delete or rename the big log file (path: C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA).
4. Attach the database again: right-click Databases => Attach. In the Attach Database box, click Add..., browse to the database (.mdf) file, and click OK. Select the log (.ldf) file, click Remove, and finally click OK.

You can now see the new log file with the minimum size. Have questions? Contact the technology experts at InApp to learn more.

Software Test Metrics | Defect Metrics | Defect Slippage Ratio


Introduction: Metrics can be defined as "standards of measurement". A metric is a unit used for describing or measuring an attribute. Test metrics are the means by which software quality can be measured; they provide visibility into the readiness of the product and give a clear measurement of its quality and completeness.

What are Test Metrics?

Test metrics are quantitative measures used to estimate the progress, quality, and other activities of the software testing process.

Why do we Need Metrics?

"You cannot improve what you cannot measure." "You cannot control what you cannot measure." Test metrics help to:

- Take decisions for the next phase of activities
- Provide evidence for a claim or prediction
- Understand the type of improvement required
- Take decisions on process or technology changes

Types of Test Metrics

Base Metrics (Direct Measure): Base metrics constitute the raw data gathered by a test analyst throughout the testing effort. These metrics are used to provide project status reports to the test lead and project manager; they also feed into the formulas used to derive calculated metrics. Examples: number of test cases, number of test cases executed.

Calculated Metrics (Indirect Measure): Calculated metrics convert the base metrics data into more useful information. These types of metrics are generally the responsibility of the test lead and can be tracked at many different levels (by module, tester, or project). Examples: % complete, % test coverage.

Related topics include the metrics life cycle, defect metrics, release criteria, and defect patterns.

Test Plan Coverage on Functionality: the total number of requirements vs. the number of requirements covered through test scripts.

(Number of requirements covered / Total number of requirements) * 100

Define requirements at the time of effort estimation.

Example: The total number of requirements estimated is 46; the total number of requirements tested is 39, with 7 blocked. What is the coverage?
Note: Define requirements clearly at the project level.

Test Case Defect Density: the total number of errors found in test scripts vs. those developed and executed.

(Defective test scripts / Total test scripts) * 100

Example: Total test scripts developed: 1360; total executed: 1280; total passed: 1065; total failed: 215.

So the test case defect density is (215 / 1280) * 100 = 16.8%

This 16.8% value can also be called the test case efficiency %, which depends upon the total number of test cases that uncovered defects.

Defect Slippage Ratio: the number of defects slipped (reported from production) vs. the number of defects reported during execution.

Number of defects slipped / (Number of defects raised - Number of defects withdrawn)

Example: Customer-filed defects: 21; total defects found while testing: 267; total invalid defects: 17.

So the slippage ratio is [21 / (267 - 17)] * 100 = 8.4%

Requirement Volatility: the number of requirements agreed vs. the number of requirements changed.

(Number of requirements added + deleted + modified) * 100 / Number of original requirements

Ensure that the requirements are normalized or defined properly while estimating.

Example: The VSS 1.3 release had a total of 67 requirements initially; later, 7 new requirements were added, 3 were removed from the initial requirements, and 11 were modified.

So the requirement volatility is (7 + 3 + 11) * 100 / 67 = 31.34%

Review Efficiency: a metric that offers insight into the quality of reviews and testing. Some organizations also use this term for "static testing" efficiency, aiming to find a minimum of 30% of defects in static testing.

Review efficiency = 100 * Total number of defects found by reviews / Total number of project defects

Example: A project found a total of 269 defects in different reviews, which were fixed, and the test team reported 476 valid defects.
So the review efficiency is [269 / (269 + 476)] * 100 = 36.1%

Efficiency & Effectiveness of Processes

Effectiveness: doing the right thing. It deals with meeting the desirable attributes that are expected by the customer.

Efficiency: doing the thing right. It concerns the resources used for the service to be rendered.

Have questions? Contact the software testing experts at InApp to learn more.
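The derived metrics in this section are simple ratios, so they are easy to script. A small JavaScript sketch using the article's example figures (function names are ours; the formulas are the ones given above):

```javascript
// Defect slippage ratio, as a percentage.
function defectSlippageRatio(slipped, raised, withdrawn) {
  return (slipped / (raised - withdrawn)) * 100;
}

// Requirement volatility, as a percentage of the original requirement count.
function requirementVolatility(added, deleted, modified, original) {
  return ((added + deleted + modified) * 100) / original;
}

// Review efficiency: review defects as a share of all project defects.
function reviewEfficiency(reviewDefects, testDefects) {
  return (reviewDefects / (reviewDefects + testDefects)) * 100;
}

// The example figures from this section:
console.log(defectSlippageRatio(21, 267, 17).toFixed(1));    // -> 8.4
console.log(requirementVolatility(7, 3, 11, 67).toFixed(2)); // -> 31.34
console.log(reviewEfficiency(269, 476).toFixed(1));          // -> 36.1
```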

Types of Project Metrics


A metric is an inevitable part of any piece of work being performed. It is a system in place to measure the excellence, or rather the performance, of the work delivered. Work that is not controlled and measured can prove equivalent to incorrect work being delivered. Technology grows at such a tremendous pace that enterprises always strive to keep defined project metrics. Project metrics can be stated as pre-defined or identified measures or benchmarks that the deliverable is supposed to attain in order to provide the expected value. With clearly defined project metrics, business groups are able to assess the success of a project. Though certain unstated measures, such as whether the project was delivered on time and within budget, have existed ever since the advent of enterprises, the need for more analytics in this area has seen a high spike. There are different types of project metric analysis systems in place across the industry, such as costing, resource, and hours-based systems. Let me take you through some common project metrics that relate to the person-hours delivered in a project.

Effort Variance (Ev)

This is a derived metric that alerts you to whether you have control over the project. Let there be a project A with the following current attributes:

Planned effort: 100 hrs
Actual effort: 150 hrs
Project progress percentage: 50%

If 150 hrs were taken at 50% progress, then at 100% progress X hrs will be taken:

X = (100 * 150) / 50 = 300 hrs

where X is the predicted effort within which the project is going to complete. Hence the variance is:

Ev = ((Actual - Planned) / Planned) * 100 = ((300 - 100) / 100) * 100 = 200%

The predicted variance indicates that the project requires attention, or it will complete at a much higher cost in terms of the effort delivered.

Schedule Variance (Sv)

Schedule variance uses the same calculation, with the number of days considered instead of hours.
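The effort variance calculation above can be sketched in a few lines of JavaScript (function names are ours; the formulas are the ones given above):

```javascript
// Project the total effort from the actual hours spent at a given
// progress percentage: X = (100 * actual) / progress.
function projectedEffort(actualHours, progressPct) {
  return (actualHours * 100) / progressPct;
}

// Effort variance against the planned effort, as a percentage.
function effortVariance(projected, planned) {
  return ((projected - planned) / planned) * 100;
}

// Project A above: planned 100 hrs, actual 150 hrs at 50% progress.
const projected = projectedEffort(150, 50);        // -> 300 hrs
console.log(effortVariance(projected, 100) + '%'); // -> 200%
```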
Weighted Defect Rate (WDR)

WDR is a defect metric calculated based on the weightage assigned to the reported bugs. The weightage depends on two factors: severity and reporter.

Weightage against severity, in descending order: Block, Crash, Major, Minor
Weightage against reporter, in descending order: Client, SQC, Team

The rate is calculated against the total planned hours for the project.

Quality Costs

Cost of Quality: the total time spent on review activities in the project. Examples are requirements review, design review, code review, test plan review, team meetings for clarifications, client calls, etc.

COQ = (Total review hrs / Total project planned hrs) * 100

Cost of Detection: the total time spent on the testing activity.

Cost of Detection = (Total testing hrs / Total project planned hrs) * 100

Cost of Failure: the total time spent on rework in the project. Rework includes bug fixing, design changes, test plan changes, etc.

Cost of Failure (Cost of Poor Quality, CoPQ) = (Total rework or bug fixing hrs / Total project planned hrs) * 100

Have questions? Contact the technology experts at InApp to learn more.
