SIMULATED FIELD ENVIRONMENT TEST AUTOMATION ... - QAI


SIMULATED FIELD ENVIRONMENT TEST AUTOMATION & CONTINUOUS REGRESSION TESTING
12th Annual International Software Testing Conference, 2012
Ethiraj Alwar
Matthew Houghton


AGENDA
• Need for a Simulated Field Environment
• Challenges of test organizations
• Traditional approach to automation
• Traditional Automation vs. Simulated Field Environment
• Simulated Field Environment in LTE System Verification:
  - End user view
  - Design
  - Implementation
  - Practice
  - Benefits
  - Results
• Summary


Regression - Challenges
• Regression is one of the key components of any test organization.
• Regression effort increases due to:
  - Introduction of new features and enhancements
  - Support of new platforms and product configurations
  - Frequent integrations and builds
• Optimizing the regression effort is a challenge, and automation is a key enabler.
• The traditional approach to automation is not an effective solution.
• The Simulated Field Environment approach is an innovative alternative.


Simulated Field Environment
• A unique approach to test automation.
• Key guiding principles:
  - Close to the field environment (asynchronous in nature)
  - Continuous monitoring
  - Applicable at the system level, functional or performance
• Key benefits:
  - Increased test effectiveness
  - Improved defect find rate for reliability- and availability-related defects
  - Increased opportunity time
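
The deck describes these principles at a high level; as a purely illustrative sketch (none of the class or source names below come from the deck), the two core ideas - asynchronous traffic from several independent control points, plus a monitor that samples continuously rather than per test case - can be modeled like this:

```python
import random
import threading
import time

class SimulatedFieldEnvironment:
    """Minimal sketch: several traffic sources fire events asynchronously
    while a monitor thread continuously samples system state."""

    def __init__(self, sources):
        self.sources = sources          # name -> event-triggering callable
        self.events = []                # (timestamp, source, result)
        self.health_samples = []        # one entry per monitoring snapshot
        self._lock = threading.Lock()
        self._stop = threading.Event()

    def _drive(self, name, action, max_gap):
        # Each control point triggers independently at randomized intervals,
        # approximating the asynchronous nature of field traffic.
        while not self._stop.is_set():
            time.sleep(random.uniform(0, max_gap))
            result = action()
            with self._lock:
                self.events.append((time.time(), name, result))

    def _monitor(self, interval):
        # Continuous monitoring: snapshot on a fixed cadence, independent
        # of any individual test scenario.
        while not self._stop.is_set():
            with self._lock:
                self.health_samples.append(len(self.events))
            time.sleep(interval)

    def run(self, duration, monitor_interval=0.05):
        threads = [threading.Thread(target=self._drive, args=(n, a, 0.02))
                   for n, a in self.sources.items()]
        threads.append(threading.Thread(target=self._monitor,
                                        args=(monitor_interval,)))
        for t in threads:
            t.start()
        time.sleep(duration)
        self._stop.set()
        for t in threads:
            t.join()

# Hypothetical control points standing in for real call-processing and
# OAM triggers.
sfe = SimulatedFieldEnvironment({
    "call_setup": lambda: "ok",
    "handover": lambda: "ok",
    "om_audit": lambda: "ok",
})
sfe.run(duration=0.3)
```

The point of the sketch is the structure, not the timings: traffic generation and monitoring run concurrently and never synchronize on a script's step order, which is what distinguishes this from batch-mode automation.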


Traditional Automation vs. Simulated Field Environment
[Diagram: continuous regression combines batch mode, driven by test scripts, with the Simulated Field Environment, driven by traffic profiles.]


Traditional Automation vs. Simulated Field Environment
• Approach - Traditional: more focused on the traditional test approach. SFE: closer to a field environment.
• Test coverage - Traditional: focused on a specific test scenario or test case. SFE: focused on triggering asynchronous events from the different control points involved in the product/system under test.
• Nature of the test traffic - Traditional: instantaneous and sequential. SFE: continuous and asynchronous.
• Feature interaction - Traditional: only as scripted. SFE: inherent.
• Log collection - Traditional: collects execution logs only for the specific test case that is executed. SFE: continuous log collection.


Traditional Automation vs. Simulated Field Environment (continued)
• Key Performance Indicators - Traditional: focused on verifying logs and test scenarios. SFE: focused on analyzing and arriving at KPIs and system health parameters.
• Monitoring - Traditional: focused on testing. SFE: focused on monitoring.
• Verification aspects - Traditional: can only be used to verify functionality. SFE: can assess the reliability and stability of the system; functionality is continuously verified.
• Applicability - Traditional: applicable for product-level verification. SFE: applicable for system-level verification.
• Cycle time - Traditional: serial (grows serially with the number of test cases). SFE: new scenarios are integrated into an existing profile; cycle time increases only when a new profile is required, and then only in increments of hours.
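
The cycle-time contrast can be made concrete with illustrative numbers (the per-case and per-profile rates below are assumptions for the sketch, not figures from the deck):

```python
def traditional_cycle_hours(num_cases, hours_per_case=0.5):
    # Batch-mode wall time grows serially with the test-case count.
    return num_cases * hours_per_case

def sfe_cycle_hours(num_profiles, hours_per_profile=24):
    # New scenarios fold into an existing continuous profile; wall time
    # grows only when an additional profile is required.
    return num_profiles * hours_per_profile

# Adding 100 scenarios: traditional regression grows by 50 hours,
# while a single SFE profile absorbs them at no added cycle time.
before = (traditional_cycle_hours(200), sfe_cycle_hours(1))
after = (traditional_cycle_hours(300), sfe_cycle_hours(1))
```

Under these assumed rates the traditional cycle grows linearly while the SFE cycle stays flat until a second profile is actually needed.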


LTE SFE – End User View (snapshots every 15 minutes)
Monitor the overall health of the system:
• FM server logs, CM server logs, SDL server logs
• FM, CM, and SDL status (snapshot every 15 minutes)
• Active/standby server status
• DHCP message count and ack count
• New eNB-PM added / new eNB-PM removed
• CLI command summary
• HW problems reported by the fmadm command
• LDAP user details
• Any process that is NOT online (based on a snapshot run every 15 minutes)
• Indication if any processes went up or down
Monitor the traffic profile:
• List of eNB versions and list of IM versions
• Indication of any OAM profile running
• NBI configuration info
• NBI alarm analysis based on sniffer logs, with links to the alarms within each 15-minute interval
• Tacacs configuration check (based on CLI commands)
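
As a sketch of what one 15-minute snapshot might compute (the process names and states here are illustrative, not taken from the deck):

```python
def snapshot_health(process_states):
    # One monitoring snapshot: every process that is NOT online.
    return sorted(name for name, state in process_states.items()
                  if state != "online")

def transitions(prev, curr):
    # Indication of any process that went up or down between two
    # consecutive snapshots.
    return {name: (prev.get(name), state)
            for name, state in curr.items() if prev.get(name) != state}

prev = {"fm_server": "online", "cm_server": "online", "sdl_server": "online"}
curr = {"fm_server": "online", "cm_server": "down", "sdl_server": "online"}
offline = snapshot_health(curr)
changed = transitions(prev, curr)
```

Comparing consecutive snapshots is what turns a static status page into the up/down indications the slide describes.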


LTE SFE - Design
[Diagram: a continuous traffic profile drives the test environment across call processing (HSS, MME, PCRF, SGW & PGW) and operations & maintenance (NBI, EMS), exercising a cluster of eNBs with dongle-attached UEs; continuous log collection feeds continuous log analysis and a web-based KPI monitoring page.]
• System Verification verifies the CP and OM functionality in an E2E lab configuration.
• Collection of logs through the SFE framework enables effective debugging and root-causing of issues.


LTE SFE - Implementation
The SFE test automation framework comprises a CP traffic profile, an OM traffic profile, log collection, and monitoring & analysis.
[Diagram: application servers (IPERF, Video, FTP, IMS, ...), an IMS-sim, NTP/DNS/DHCP servers, and a PC-based HSS/PCRF simulator on the app transport network; PGW, SGW, MMEs, MLS switches, and an L2 switch on the core transport network; an eNB cluster / UE bench with remote-accessible UEs, programmable RF attenuators, and an LMT PC; sniffer PCs (wireshark/tshark) and a T-tap at key interfaces; NE log collection (MME, LTEmgr, etc.); traffic profiles cover UE traffic, GUI traffic, general OAM traffic, and the RF attenuators. Logs constantly feed users through the SFE-KPI, a web-based KPI interface for system analysis and log access, reachable from the office network over the SV lab network (RDNet).]
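
The deck does not show code for the log-collection piece; as an illustrative sketch (class and source names are hypothetical), the essential idea is a single time-stamped, bounded buffer fed continuously by every network element:

```python
import collections
import time

class ContinuousLogCollector:
    """Sketch of continuous log collection: entries from every network
    element land in one bounded, time-stamped buffer, so the window
    around any failure is always available for root-causing."""

    def __init__(self, capacity=10000):
        # A ring buffer keeps memory bounded during days-long runs.
        self.buffer = collections.deque(maxlen=capacity)

    def ingest(self, source, line):
        self.buffer.append((time.time(), source, line))

    def window(self, since):
        # Everything captured at or after `since`, across all sources.
        return [entry for entry in self.buffer if entry[0] >= since]

collector = ContinuousLogCollector(capacity=1000)
for source, line in [("MME", "attach accepted"),
                     ("eNB-1", "rrc connection setup"),
                     ("EMS", "alarm raised"),
                     ("MME", "detach request")]:
    collector.ingest(source, line)
recent = collector.window(since=0)
```

In the real framework the sources would be tshark captures and NE log feeds rather than in-process calls, but the interleaved, cross-source view is the same property the KPI page relies on.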


LTE SFE – Practice
[Workflow: planning (lab management, resource management, test management, test case repository) feeds SFE bench setup (SFE bench install/upgrade), which feeds test execution (continuous regression); SFE engineers (1-2 engineers) monitor the SFE KPIs and update results.]


LTE SFE - Benefits
• The Simulated Field Environment is both effective and efficient.
• Test effectiveness: defect find rate.
• Test efficiency: execution productivity.


LTE SFE - Results
• Maturity:
  - Cornerstone of regression for E2E CDMA infrastructure since 2005.
  - 2nd-generation framework designed and implemented in 2009 for testing E2E LTE infrastructure (7 releases and counting).
• Incremental effort:
  - The initial effort is fully aligned with designing and implementing the test environment; in our case the SFE was designed and implemented by a team of 4 test engineers.
  - A pair of engineers (2) maintain and monitor a single SFE profile.
• Test efficiency:
  - Improvements in efficiency similar to traditional approaches to automation.
• Test effectiveness:
  - The continuous nature of this approach yielded a dramatic increase in reliability/availability defects found (well over 100%).
  - A significant increase in functional defects found (over 30%).


LTE SFE – Return on Investment
Category: Initial Year (USD) / Subsequent Years (USD)
• Equipment cost: $25K / none
• Engineering cost (4 engineers): $288K / $288K
• Reduction in cost of poor quality: $900K / $1,800K
• Return on investment: $587K / $1,512K
Notes:
• Equipment and engineering costs are incremental to more traditional approaches to automation.
• Engineering cost is mainly for SFE test environment development; an engineering cost of $72K per engineer per annum is considered.
• The reduction in cost of poor quality is due to the incremental increase in defects found (an average of 1.5 per month): complex defects that would not be found in more traditional environments and would otherwise have become escaped defects (at a cost of $50K per escape).
• Partial (50%) return in the initial year; full return in subsequent years.
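
The table's arithmetic is internally consistent; a quick recomputation (all figures in thousands of USD, taken directly from the slide):

```python
# Engineering cost: 4 engineers at $72K per annum each.
engineers, cost_per_engineer = 4, 72
engineering = engineers * cost_per_engineer        # 288 per year

initial_cost = 25 + engineering                    # equipment + engineering
subsequent_cost = engineering                      # no equipment after year 1

# Reduction in cost of poor quality: partial (50%) in the initial year,
# full in subsequent years.
initial_savings, subsequent_savings = 900, 1800

initial_roi = initial_savings - initial_cost       # 900 - 313 = 587
subsequent_roi = subsequent_savings - subsequent_cost  # 1800 - 288 = 1512
```

Both ROI lines reproduce the slide's figures of $587K and $1,512K.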


Summary
• SFE is an innovative approach to automation: it shifts focus from the "traditional" paradigm to the "SFE" paradigm.
• SFE components and implementation:
  - Design and implement the SFE
  - Implement the continuous traffic profile
  - Identify KPIs and implement the KPI web page
  - Train the engineers in SFE profile monitoring
• Leverage all the techniques for test optimization:
  - Manual test execution
  - Batch mode
  - Simulated Field Environment


Acknowledgements
Contributions from Pak Hui, Shyam Tak and Alex Reyther, who developed the SFE Framework for the CDMA & LTE network infrastructure, to the content of this paper.
Several people have been involved in the SFE automation efforts over many years, not all of whom can be individually named. We would like to mention the contributions of the automation team: B R Prashanth, Ankit Kapoor, Kavitha N & team.
Efforts from the LTE Functional Regression Team, including Amit Das, Shailesh Kumar, and Vivek Bangera.


Thank You!!!
