Spirent Communications NOMADHD-01 Table Top HW device for Voice Quality and Call Performance User Manual Part 2

 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    79  8 Analyzing the Results Once a suitable amount of data has been collected, use the Stop   button to terminate data collection.  If the current test has been set to end after a specific number of cycles or calls, data collection will stop automatically at the conclusion of the test sequence.    When a Voice Quality test is stopped, the End Active Calls dialog will appear.  It is critical to hang up all active Voice Quality calls before clicking OK to close this dialog.  Failure to disconnect the calls prior to closing this dialog will result in data missing from the merged output report.   Figure 8-1 - End Active Calls Dialog  When data collection stops, the Nomad Data tab will appear for report generation and additional analysis.
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    80  8.1 Generating a Report In most cases, Nomad will automatically merge the downlink data collected at the test PC with the uplink data collected at the Audio or Call Server, provided that an internet connection is available at merge time.  The merged file will be listed on the Data → Complete tab and will be available for report generation.  In addition to the File Name, the UTC Start Time and End Time and the task type performed at each channel are listed for identification purposes.  The listed reports may be sorted using the column headers.  (If the session file appears on the Data → Incomplete tab, see Section 8.10 for troubleshooting tips.)   Figure 8-2 – Merged Session Files on the Complete Tab
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    81  To generate a report using merged session files:   Select one or more files to be included in the output report.  Click the Generate Report From Selected   button.  If multiple session files have been selected, choose whether to Merge multiple logging sessions into one report or to Produce one report for each logging session.   Figure 8-3 - Output File Type Selection   The Review Merged Session Report Header Data dialog will appear.  Channels with inconsistent header data will be highlighted yellow.  This dialog may also be used to move data among channels for reporting purposes. o Examine the data for any marked inconsistencies.  Remove any unwanted data from the merge or use the Move Selected button to move data to a different channel.  Moving data may be necessary if the same device was used on different channels during different test sessions. o The Move Selected button may also be used to move data among channels for reporting purposes.  Use this option to aggregate data collected across multiple channels in the report output. o Provide a unified Name for the channel data in the report. o Click Generate Report when all data has been aligned as desired.  If multiple test session files have been selected, the Save As dialog will appear.  Name the Nomad report source file, or text file containing all data for the unified output report.
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    82   Figure 8-4 - Review Merged Session Report Header Data Dialog   The source files used to generate the formatted output report and KML file for geographic visualization of MOS data will be listed on the Data → Reports tab.  A report file created from a single test file will take the name of that file.  A report file created from multiple test files will take the name assigned by the user during the report generation process.   Figure 8-5 - List of Generated Reports on the Reports Tab  8.2 Displaying a Formatted Report To display the formatted report output:   On the Data → Reports tab, click the View Standard link for the data of interest.
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    83   Note:  If the View Standard link is not present, reports must be generated by a team member with access to this functionality.  Contact Spirent Support if you believe this capability has been omitted in error.  In the Save As dialog, provide a name for the formatted Excel report.  Note:  Formatted output reports are displayed in .XLSX format.  Microsoft Excel 2007 or newer is required to open files in this format.  If you are running an older version of Excel, you must download the Microsoft Excel Viewer to view the .XLSX output reports.   Figure 8-6 - Naming the Formatted Excel Report   Nomad supports report template customization.  If multiple customized report templates are available, the Report Template Selection dialog will appear.  Select the desired template to use.  See Section 8.3 for details on customizing report templates.   Figure 8-7 - Report Template Selection Dialog  The report will open to the Device Overview tab.  This page lists the metadata entered for each test device, and provides links to each of the Voice Quality and Call Performance reports.  Click on any report Name to go directly to that report.
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    84   Figure 8-8 - Device Overview Report Landing Page  Formatted Voice Quality data can be found on the Voice Quality Summary tab of the output and includes statistical and distribution information for Voice Quality session results.   Figure 8-9 - MOS Statistical Summary
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    85   Figure 8-10 - Downlink MOS Distribution   Figure 8-11 - Uplink MOS Distribution  Formatted Delay Performance data can be found on the Delay Performance Summary tab.
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    86   Figure 8-12 - Delay Performance Summary Report  Formatted Call Performance data can be found on these tabs:   Call Performance Summary  Call Initiation  Call Retention  Audio Verification  Device Performance
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    87   Figure 8-13 - Call Performance Summary Report   Figure 8-14 - Call Initiation Performance Report
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    88   Figure 8-15 - Call Retention Performance Report   Figure 8-16 - Audio Verification Performance Report
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    89   Figure 8-17 - Device Performance Report  8.3 Report Template Customization Nomad supports report template customization.  To customize a report template:   Find the default Nomad report template in the Templates directory of the Nomad installation location (typically C:\Program Files\Spirent Communications\Nomad).  The name of the file is Nomad-Template.xlsx.  Make a copy of the Nomad-Template.xlsx template file in the same directory.  Rename the copy with a meaningful name.  Open the new file and edit the formatted report pages using Excel 2007 (.XLSX format).  Note that existing worksheet names must remain the same but new sheets may be added.  When the changes are complete, save the Excel file.  When more than one report template is available in the Templates folder, Nomad will present the Report Template Selection dialog during the report generation process.  At that time, any custom template may be selected for data population.   Figure 8-18 - Report Template Selection Dialog
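Note:  Nomad populates data into worksheets that it locates by name, so a customized template must keep the original worksheet names.  The following Python sketch illustrates one way to confirm this before using a template; it is not part of Nomad, the openpyxl package is assumed to be installed, and the sheet names listed are examples only (copy the authoritative list from the unmodified Nomad-Template.xlsx).

# Sketch: confirm that a customized report template keeps the required worksheet names.
# REQUIRED_SHEETS is illustrative -- copy the actual names from Nomad-Template.xlsx.
from openpyxl import load_workbook

REQUIRED_SHEETS = ["Device Overview", "Voice Quality Summary", "Call Performance Summary"]

def check_template(path):
    workbook = load_workbook(path, read_only=True)
    missing = [name for name in REQUIRED_SHEETS if name not in workbook.sheetnames]
    if missing:
        print("Missing required worksheets:", ", ".join(missing))
    else:
        print("All required worksheets are present.")

check_template(r"C:\Program Files\Spirent Communications\Nomad\Templates\My-Custom-Template.xlsx")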
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    90  8.4 Interactive  Viewer To display the Interactive Viewer:  On the Data → Reports tab, click the Interactive button for the data of interest.   Figure 8-19 – Interactive button on Reports tab  This will display the Interactive Viewer, which will represent cycle level data from each channel in the session report.  For Voice Quality data, a graph will be shown displaying the MOS scores over time.  Hovering over a point on the graph will display the MOS score for that cycle.  Clicking on a point in the graph will display information at the bottom of the window relative to that cycle.
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    91    Figure 8-20 – Interactive Viewer Voice Quality data  For Call Performance data, a table will be shown displaying the results of each call:
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    92   Figure 8-21 – Interactive Viewer Call Performance data  8.5 Visualizing Data Geographically The Nomad KML output capability provides a simple method of visualizing MOS and Call Performance data geographically.    Google Earth™ must be installed in order to take advantage of geographic visualization.  Google Earth™ may be obtained from:  http://earth.google.com  An internet connection is required for map access while running the program.  To visualize data in Google Earth™:   On the Data → Reports tab, click the View link in the KML column for the data of interest.  In the Save As dialog, provide a name for the KML map file.
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    93   Figure 8-22 - Naming the KML Map File   The KML Generation Complete dialog will appear.  Choose whether to view the data now.   The data can be viewed at any time by opening the .KML file using the File → Open command from the main menu in Google Earth™.   Figure 8-23 - KML Generation Complete Dialog  Google Earth™ will open and zoom to the map location.  MOS measurements or Call Performance events will be shown as color-coded points along the drive route.  The color thresholds for each MOS range are determined by the thresholds set on the Settings → Voice Quality tab (see Section 4.4.1).  Colors for Call Performance events are determined by the system.
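Note:  Nomad writes the KML file automatically, so no scripting is required.  For readers who want to understand what the color-coded points represent, the following Python sketch shows the general form of a KML placemark for a single MOS measurement.  It is an illustration only, not a description of the exact KML produced by Nomad, and the thresholds and colors used are placeholders for the values configured on the Settings → Voice Quality tab.

# Sketch: emit one color-coded KML placemark for a MOS measurement.
# Thresholds and colors are illustrative placeholders, not Nomad's actual values.
def mos_color(mos):
    if mos >= 3.5:
        return "ff00ff00"   # green (KML colors are ordered aabbggrr)
    if mos >= 2.5:
        return "ff00ffff"   # yellow
    return "ff0000ff"       # red

def placemark(lat, lon, mos):
    return (
        "<Placemark>"
        "<name>MOS {mos:.2f}</name>"
        "<Style><IconStyle><color>{color}</color></IconStyle></Style>"
        "<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    ).format(mos=mos, color=mos_color(mos), lat=lat, lon=lon)

print(placemark(39.2904, -76.6122, 3.8))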
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    94   Figure 8-24 - Geographic Display of MOS Data in Google Earth™   In the Places window in the left-hand panel, use the selection boxes to isolate the data by channel and link.  For example, this map has been customized to include only Downlink data for Channel 1.   Figure 8-25 - Isolating the Data of Interest   The map image may be saved as a .JPG file by selecting File → Save → Save Image. 8.6 Reporting IP Analytics If the IP Analytics right is present, collected IP data can be correlated and reported with the session results from Nomad.  To correlate IP data:
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    95   On the Data → Complete tab, select one or more files with which you wish to associate data from a pcap file.    Click the Import IP Analytics File button.  Navigate to a pcap file that was collected on the device or server side during the selected session(s).  The data from the pcap file will be analyzed and correlated with the time period of the selected session(s).  To generate a report displaying the IP Analytics data correlated with Nomad results:  On the Data → Complete tab, select one or more files for which you wish to generate the report.  Click the Generate Report From Selected   button.  The Review Merged Session Report Header Data window will appear (for more information on the general functionality of this window, see Section 8.1).  The streams of imported IP data will be listed in the IP Analytics table at the bottom of the window.  For each applicable channel (the tabs on the left), select the Downlink or Uplink radio button in the IP Analytics table for the corresponding streams.   Figure 8-26 – Associating IP Analytics data   Press the Generate Report button.  On the Save As window, provide a location and name for the report to be generated, and then click the Save button.  A report with the provided name will be displayed on the Data → Reports tab.  Click the View IP link to view an Excel report containing the IP Analytics data and associated Nomad results.  Click on the Interactive button to view an interactive representation of the IP Analytics data and associated Nomad results.  The generated Excel file will display graphs of Nomad results, and graphs of the corresponding Jitter, Delay, and Throughput statistics for the same time period.  At the top of the Excel report, you can select the starting cycle and the number of cycles to be represented in the graphs.
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    96   Figure 8-27 – Selecting Displayed Data  The IP Analytics data can also be viewed in the Interactive Viewer.  To view the data in the Interactive Viewer, click the Interactive button for the corresponding report on the Data → Reports tab.  For Call Performance tests, a table will be displayed listing each call result.  To display the associated stream data, click on a call result within the table.  Data on any associated Uplink and Downlink streams for the selected result will be displayed beneath the table.   Figure 8-28 – Interactive Viewer – Call Performance data
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    97   These charts include RTP Throughput, RTP Jitter, and RTP Relative Delay over time, as well as charts of the distribution of RTP Relative Delay and the distribution of RTP Jitter.  For Voice Quality tasks, a graph will be displayed showing the MOS scores for the task.  Beneath the graph, the charts mentioned above will be displayed for the duration of the task for all associated streams (Uplink and Downlink).  Clicking on any MOS score in the top graph will filter the remaining charts to the data corresponding to that cycle.  You may restore the charts to represent the entire task by clicking the Restore Channel View button.
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    98    Figure 8-29 – Interactive Viewer – Voice Quality data
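Note:  The interarrival jitter plotted in these charts is conventionally calculated using the running estimate defined for RTP in RFC 3550.  The following Python sketch shows that standard calculation from packet arrival times and RTP timestamps.  It illustrates the metric itself and is not a description of the internal implementation used by Nomad.

# Sketch: RFC 3550 interarrival jitter from RTP packet data.
# arrival_times are receive times in seconds, rtp_timestamps are the RTP timestamp
# field values, and clock_rate is the payload clock (e.g. 8000 Hz for narrowband speech).
def interarrival_jitter(arrival_times, rtp_timestamps, clock_rate):
    jitter = 0.0
    for i in range(1, len(arrival_times)):
        # Difference in relative transit time between consecutive packets, in timestamp units
        transit_prev = arrival_times[i - 1] * clock_rate - rtp_timestamps[i - 1]
        transit_curr = arrival_times[i] * clock_rate - rtp_timestamps[i]
        d = abs(transit_curr - transit_prev)
        # Running estimate per RFC 3550: J = J + (|D| - J) / 16
        jitter += (d - jitter) / 16.0
    return jitter / clock_rate   # convert back to seconds

# Example: 20 ms packets (160 samples at 8 kHz) arriving with slight timing variation
arrivals = [0.000, 0.021, 0.039, 0.062]
timestamps = [0, 160, 320, 480]
print(interarrival_jitter(arrivals, timestamps, 8000))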
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    99      8.7 Collecting Random Audio Sample Files (Voice Quality Testing) Nomad has the ability to set aside a random sample of audio files from a Voice Quality test within each MOS performance range.  This allows users to audibly verify the performance of Great, Good and Bad audio samples.  Random audio file sampling must be enabled prior to data collection on the Settings → Voice Quality tab in order to capture this data (see Section 4.4.1).  Start and run the test normally.  After the uplink and downlink data for the test has been merged and the session file appears on the Data → Complete tab, a ZIP containing the sample uplink and downlink audio files will be available in the file’s details directory.  This folder resides in the log file storage location specified in the Start Logging Session Wizard (see Section 6.4) and takes the same name as the session log file.  The ZIP file contains a directory structure that organizes the audio samples by channel and by performance threshold.  The .WAV audio samples reside within the directory structure.  Some notes about the directory structure containing the audio samples:   If the specified number of files does not exist for a range, Nomad will include all available .WAV files for that range.  If no samples fall within a range, no folder will be present for that range.  The files can be played with any media player that supports .WAV files.   Figure 8-30 - Random Audio Sample Directory Structure
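Note:  Because the samples are delivered as a standard ZIP file, they can also be listed or extracted programmatically.  The following Python sketch simply groups the .WAV entries by their folder (channel and performance range); the archive path shown is a hypothetical example, and the exact folder names depend on the configured thresholds.

# Sketch: list the random audio samples in the ZIP by channel / performance range folder.
# The archive path is a hypothetical example; use the actual session name and log storage location.
import zipfile
from collections import defaultdict

def list_samples(zip_path):
    groups = defaultdict(list)
    with zipfile.ZipFile(zip_path) as archive:
        for name in archive.namelist():
            if name.lower().endswith(".wav"):
                folder = "/".join(name.split("/")[:-1])   # e.g. "Channel 1/Good"
                groups[folder].append(name.split("/")[-1])
    for folder in sorted(groups):
        print(folder + ":", len(groups[folder]), "sample(s)")

list_samples(r"C:\Nomad Logs\MySession\AudioSamples.zip")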
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    100  8.8 Viewing the Logs The log file containing the messaging for each test is stored in the Log File Storage Location specified in the Start Logging Session Wizard (see Section 6.4).  The log files are maintained in an open, comma-delimited format for direct access to the information.   Figure 8-31 - Nomad Test Log  8.9 Waveform Analysis with the PESQ Tools GUI (Voice Quality Testing) The Psytechnics PESQ Tools GUI is available for Nomad users wishing to perform detailed analysis of any waveform captured during Voice Quality testing.   PESQ takes into account signal degradation caused by coding distortions, errors, packet loss, delay, and filtering in analog network components.  The PESQ Tools GUI can be useful in understanding why a sample was scored the way that it was.  This section describes the operation of the GUI and the most relevant analyses that can be performed with the tool.  Note:  This tool is only applicable to the PESQ scoring method.  At this time, there is no comparable tool for POLQA.  Note:  At this time, the PESQ Tools GUI is not being packaged within Nomad.  Please contact Spirent Product Support to obtain the PESQ Tools GUI installation.
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    101   Figure 8-32 - PESQ Tools GUI  The PESQ algorithm calculates MOS by comparing audio samples degraded by the communication channel to the original source sample.  Therefore, the PESQ Tools GUI requires both the original and the degraded samples as inputs.  To analyze a degraded sample:  1. Open the Reference File  From the Ref File tab of the control panel, click on the Open   button in the Reference file path area.  The standard audio scoring files are stored in the C:\Program Files\Spirent Communications\Nomad\Audio Files directory.  Select narr_usasts_107dB.wav for narrowband handsets (8 kHz) or wide_usasts_107dB.wav for wideband handsets (16 kHz).   The Hardware Sample Rate (8 kHz or 16 kHz) can be confirmed on the Nomad Settings → Voice Quality tab.   Figure 8-33 - Ref File Tab
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    102  2. Open the Degraded File  From the Deg File tab of the control panel, click on the Open   button in the Degraded file path area.  Browse to the storage location for the degraded log file and open the file.  Degraded session log files are stored in the Log File Storage Location specified in the Start Logging Session Wizard (see Section 6.4).   Figure 8-34 - Deg File Tab  3. Run the Analysis  On the Controls tab of the PESQ Tools GUI control panel click Run   to run the analysis.   Figure 8-35 - Controls Tab  The PESQ Tools GUI can assist in the identification of issues that contribute to low MOS such as background noise and speech clipping.  The case study in Appendix D illustrates how the PESQ Tools GUI can be used to identify these types of issues in a test file exhibiting low MOS results.
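Note:  For users who prefer to script the same reference-versus-degraded comparison outside the GUI, the open-source pesq Python package implements ITU-T P.862 scoring.  This is mentioned for illustration only; it is not part of Nomad or the Psytechnics PESQ Tools GUI, and the degraded file path shown is a hypothetical example.

# Sketch: score a degraded capture against the Nomad reference file using the
# open-source "pesq" package (ITU-T P.862).  Not the Psytechnics PESQ Tools GUI.
from scipy.io import wavfile
from pesq import pesq

ref_rate, ref = wavfile.read(r"C:\Program Files\Spirent Communications\Nomad\Audio Files\narr_usasts_107dB.wav")
deg_rate, deg = wavfile.read(r"C:\Nomad Logs\MySession\degraded_sample.wav")   # hypothetical path

assert ref_rate == deg_rate, "Reference and degraded samples must use the same sample rate"
mode = "nb" if ref_rate == 8000 else "wb"   # narrowband (8 kHz) vs. wideband (16 kHz) scoring
print("PESQ MOS-LQO:", pesq(ref_rate, ref, deg, mode))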
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    103  8.10 Troubleshooting Incomplete Data On occasion, Nomad may be unsuccessful in merging the uplink and downlink data from a test session into a unified log file for report generation.  Such files will appear on the Data → Incomplete tab at the conclusion of a test (successfully merged files appear on the Data → Complete tab).   Figure 8-36 - Incomplete Session Files  Files on the Incomplete tab contain an icon signifying each problematic channel:   The half-circle   icon indicates that Nomad does not have access to all of the data required to merge this file.  This might appear if no calls were started for a channel during a test.  This might also appear in the case of a Remote Unit test where Nomad can access the locally collected data but not the remotely collected data.  The yellow triangle   icon indicates that an error has occurred during data collection preventing Nomad from merging the uplink and downlink data associated with this test case.  The most common reason for this error is lack of internet connectivity at merge time.  Another likely cause is the incorrect entry of the Phone Number Settings in the Session dialog on the Test Setup Tab (see Section 5.2.2).
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    104  Nomad provides several options to modify an incomplete logging session to render it suitable for report generation.  Click on the   or   indicator icon to invoke the Modify Incomplete Logging Session dialog.   Figure 8-37 - Modify Incomplete Logging Session Dialog  To troubleshoot issues signified by the yellow triangle :   If lack of internet connectivity is believed to be the reason for the   error, re-establish the internet connection and then use the Retry Auto-Complete button to merge the data and move to the Complete tab.  If the   icon indicates Unable to download uplink scores, the Phone Number Settings for the session are likely incorrect.  In this case, select the Edit button to Edit Channel Configuration.  Enter the correct values for Number to call and Number calling from and then Close the dialog.  Click Retry Auto-Complete to merge the data.
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    105   Figure 8-38 - Edit Channel Configuration Dialog  To troubleshoot other incomplete data issues:   Data collected on a Remote Unit must be copied to the local machine for merging and report generation.  Once these files are available locally: o Use the Import Logging Session button to display the data on the Incomplete tab.  o Select both the local and remote data and choose Merge Selected. o The merged data will appear on the Complete tab and the report may be generated as described in Section 8.1.  Some files that cannot be merged automatically can still be merged using the Uplink Scoring Utility.  See Section 8.11 for more information about using the Uplink Scoring Utility.  If you believe that a channel has been flagged as incomplete in error, use the Mark As Complete option to disregard the incomplete data warning.  Any report created using this button will be missing data.  In this case, Voice Quality tests will be missing uplink scores or CDR data, while Mobile Terminated Call Performance tasks will be missing call server results.  The Mark As Complete option should only be used if downlink data will be sufficient for analysis.  The Retry Auto-Complete and Mark Selected As Complete options are also available at the bottom of the Data → Incomplete tab.  In general, incomplete data should be fixed before generating reports to ensure the integrity of the data.
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    106  8.11 Offline Scoring (Voice Quality Testing) The ability to generate MOS outputs for .WAV files outside of an active test session is called Offline Scoring.  Use Offline Scoring to retrieve .WAV files stored on the Audio Server when uplink results are not available during data collection and data must be retrieved after test completion.  Uplink .WAV files are stored on the Audio Server for three weeks after collection.    To perform Offline Scoring for files located on the Audio Server, start by accessing the Uplink Scoring Utility:   From the Data → Utilities tab, highlight Uplink Offline Scoring and click Launch Utility.  Alternatively, access the Uplink Scoring Utility from the Modify Incomplete Logging Session dialog on the Data → Incomplete tab.  On the Settings screen of the Uplink Scoring Utility:   Confirm that the Audio Server settings are correct using the Validate Settings button.  Select the Sample Rate as either 8 kHz (narrowband) or 16 kHz (wideband).   Select the Scoring Model as either PESQ or POLQA.  Select the appropriate narrowband or wideband reference audio file from the Audio Files directory within the Nomad installation (likely C:\Program Files\Spirent Communications\Nomad\Audio Files).  Select a Logging Session option: o Create a new logging session using offline scoring   This option creates a local uplink file based on data stored on the Audio Server for the selected call session and date / time range.  The uplink .WAV files are also downloaded to the local machine.  This option is to be used when the downlink file is not readily available, but the collection times are known.  This option might be selected when testing has been performed in the field, but an office-based engineer requires access to the uplink results and / or .WAV files.  See Section 8.11.1 to create a merged uplink file using this method. o Retrieve complementary uplink data for an existing Logging Session  This option creates a local merged uplink-downlink file based on data stored on the Audio Server which corresponds to a locally stored downlink file.  The uplink .WAV files are also downloaded to the local machine.  This option is to be used when the downlink file corresponding to the desired uplink file is readily available.  This option might be selected when local access to the .WAV files is required.  See Section 8.11.2 to create a merged uplink file using this method.  Click Next to continue.
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    107   Figure 8-39 - Uplink Scoring Utility Settings Screen  8.11.1 Creating a New Logging Session Using Offline Scoring On the Session Selection Screen of the Uplink Scoring Utility:   Select the start and end date/time values in UTC.  These can be found by opening the corresponding downlink log file and noting the timestamps of the first and last entries.  Enable the channels for which to retrieve data.  Specify the session(s) for which to retrieve the audio files.  Each handset can be identified using the {4 Digit DNIS}-{10 Digit ANI}-{MMDD} convention with: o {4 Digit DNIS} – Last four digits of the Audio Server phone number called by the mobile. o {10 Digit ANI} – The phone number of the mobile phone being tested. o {MMDD} – The two digit month and two digit date.  Click Begin Scoring.  When the Offline Scoring process is complete, access the output file from the specified location.  The file will also appear on the Incomplete tab where it can be marked as Complete for report generation.   Figure 8-40 – Creating a New Logging Session Using Offline Scoring
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    108  8.11.2 Retrieving Complementary Uplink Data for an Existing Logging Session On the Session Selection Screen of the Uplink Scoring Utility:   Enable the channels for which to retrieve data.  The details of the downlink voice quality task will appear for each channel, including start and end time of data collection.  Specify the session(s) for which to retrieve the audio files.  Each handset can be identified using the {4 Digit DNIS}-{10 Digit ANI}-{MMDD} convention with: o {4 Digit DNIS} – Last four digits of the Audio Server phone number called by the mobile. o {10 Digit ANI} – The phone number of the mobile phone being tested. o {MMDD} – The two digit month and two digit date.  Click Begin Scoring.  When the Offline Scoring process is complete, access the output file from the specified location.  The file will also appear on the Incomplete tab where it can be marked as Complete for report generation.   Figure 8-41 - Retrieving Complementary Uplink Data for an Existing Logging Session
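Note:  The handset identifier is simply the three fields joined with hyphens.  The following Python sketch shows how the identifier is constructed; the phone numbers and date used are hypothetical examples.

# Sketch: build the {4 Digit DNIS}-{10 Digit ANI}-{MMDD} handset identifier.
# The phone numbers and date below are hypothetical examples.
from datetime import date

def handset_id(audio_server_number, mobile_number, test_date):
    dnis = audio_server_number[-4:]    # last four digits of the Audio Server number dialed
    ani = mobile_number[-10:]          # ten-digit number of the mobile phone being tested
    return "{0}-{1}-{2:%m%d}".format(dnis, ani, test_date)

print(handset_id("18005551234", "4105550199", date(2013, 7, 9)))   # -> 1234-4105550199-0709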
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    109  8.12 Scoring PESQ and POLQA It may be desirable to score Voice Quality test results using both the PESQ and POLQA scoring models.  On the first pass, the data will always be scored using the Scoring Model specified in the test definition.  These options exist to score the same data using the other Scoring Model:   Re-Score Files  Batch Scoring  Uplink Offline Scoring  8.12.1 Re-Score Files The Re-score Files option will re-process any file using the Scoring Model (PESQ or POLQA) not previously used.  To re-score a file:   On the Data → Complete tab, right-click on the file to re-score and select Re-score Files.  Click Next to accept the files to re-score.   Figure 8-42 - Files to Re-score   Click Start Scoring to begin processing the data.
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    110   Figure 8-43 - Start Re-scoring   Figure 8-44 - Re-scoring in Progress   Find the re-scored file on the Data → Incomplete tab.  The file name will contain the word “rescored” for identification purposes.    Use the Retry Auto-Complete button to move the re-scored file to the Complete tab.  Generate a report from the re-scored file.
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    111  Nomad automatically detects and uses the Scoring Model not used to score the original file.  For example, if POLQA was used to score the original file, PESQ will be used when re-scoring.  The Scoring Algorithm in use can be verified on the Voice Quality Summary tab of the output report.  Note:  Although the PESQ and POLQA results for the same file cannot be output to the same report, two output reports may be viewed side-by-side (or data copied from one to another) in order to compare results.  8.12.2 Batch Scoring Nomad provides a Batch Scoring Utility to score multiple .WAV files using PESQ, POLQA or both.  The output of this utility is a delimited text file that may be viewed in raw form, opened in Excel or parsed with a script.  The Batch Scoring Utility may be used to score previously unprocessed files or for re-scoring.  Note:  The Spirent ME hardware unit must be attached in order for the POLQA and PESQ & POLQA scoring options to be available in the Batch Scoring Utility.  To batch process with this utility:   On the Nomad Data → Utilities tab, select Batch Scoring and click the Launch Utility button.  Select the appropriate narrowband or wideband reference audio file from the Audio Files directory within the Nomad installation (likely C:\Program Files\Spirent Communications\Nomad\Audio Files).  Select the Scoring Algorithm as PESQ, POLQA or PESQ & POLQA.    Use the Add Files or Add Directory button to browse for and open the .WAV files to score.  Click Next to accept the settings and proceed.   Figure 8-45 - Batch Scoring Configuration
 Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    112   Click Start Scoring to accept the settings and proceed.   Figure 8-46 - Batch Scoring in Progress   When scoring is complete, processing and MOS statistics will be presented for the selected scoring algorithm(s).   Figure 8-47 - Batch Scoring Complete
Nomad User’s Manual  Chapter 8 – Analyzing the Results Copyright © Spirent Communications, Inc. 2013    113  Use the Open File button to view the processed data.  This delimited text file can be analyzed in its raw form, opened in Excel or used as the basis of a custom processing script.   Figure 8-48 - Batch Scoring Output  8.12.3 Offline Scoring The Nomad Offline Scoring utility provides a method of retrieving and scoring uplink .WAV files from the Audio Server outside of an active test session.  Files processed using Offline Scoring may be scored using either PESQ or POLQA.  Note:  The Spirent ME hardware unit must be attached in order for the POLQA scoring option to be available in the Offline Scoring Utility.  Offline scoring can be used to score previously unprocessed files or for re-scoring.  Select the Scoring Model on the Uplink Scoring Utility Settings Screen to choose between PESQ and POLQA.  See Section 8.11 for detailed information about the Uplink Offline Scoring utility.
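Note:  As an example of the custom processing mentioned in Section 8.12.2, the delimited Batch Scoring output can be summarized with a short script.  The following Python sketch is illustrative only; the file path, delimiter and MOS column name are assumptions, so check the header row of your own output file and adjust accordingly.

# Sketch: summarize MOS values from the Batch Scoring delimited output file.
# The file path, delimiter and "MOS" column name are assumptions -- inspect the actual
# output header and adjust before use.
import csv
import statistics

def summarize(path, mos_column="MOS", delimiter=","):
    scores = []
    with open(path, newline="") as output_file:
        for row in csv.DictReader(output_file, delimiter=delimiter):
            try:
                scores.append(float(row[mos_column]))
            except (KeyError, TypeError, ValueError):
                continue   # skip rows without a numeric MOS value
    if scores:
        print(len(scores), "samples;",
              "min =", round(min(scores), 2),
              "mean =", round(statistics.mean(scores), 2),
              "max =", round(max(scores), 2))

summarize(r"C:\Nomad Logs\batch_scoring_output.txt")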
Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    114  9 Voice Quality Configuration Options Nomad provides numerous configuration options for voice quality testing.  These options include:   Voice Quality Test Calibration  Audio Server Testing  IMS Client Testing  Mobile-to-Mobile Testing  Remote Unit Testing  Landline Module Testing  Base Station Simulator Testing  Head and Torso Simulator Testing  Wideband Testing  Multi-RAB Testing  Voice Delay Testing  Adapter Kit  This section contains detailed instructions for each voice quality configuration option.
Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    115  9.1 Voice Quality Test Calibration Spirent Communications recommends calibrating the Spirent ME or Spirent HD ME hardware used with the Nomad system prior to testing for the first time to verify proper operation.  To calibrate the system:   Connect the provided Calibration Cable between Channel 1 and Channel 2 on the Spirent ME unit, between NB1 and NB2 (narrowband) on the Spirent HD ME unit, or between HD1 and HD2 (high definition) on the Spirent HD ME unit.  On the Nomad Test Setup screen, configure Channel 1 as a Mobile task with the following settings: o Session → Uplink Device:  Base task on channel 2 o Channel Settings → Audio interface for this channel:  Analog Interface for Nomad ME Units, Narrowband or High Definition for Nomad HD ME units. o Channel Settings → Input Level:  225 o Channel Settings → Output Level: 180 o Channel Settings → Microphone Detect Mode:  Confirm that this option is unchecked  On the Nomad Test Setup screen, configure Channel 2 as a Base task with the following settings: o Session → Downlink Device:  Mobile task on channel 1 o Channel Settings → Audio interface for this channel:  Analog Interface for Nomad ME Units, Narrowband or High Definition for Nomad HD ME units. o Channel Settings → Input Level:  225 o Channel Settings → Output Level: 180 o Channel Settings → Microphone Detect Mode:  Confirm that this option is unchecked  Start a new test.  The resulting MOS values for Channel 1 and Channel 2 should be 4.542 or higher, representing perfect audio.  If this is the case, the unit has passed.  Connect the Calibration Cable between Channel 3 and Channel 4 on the Spirent ME unit and repeat the test.  In the formatted Nomad output report generated with calibration test data, find results for audio received at Channel 1 in the Channel 1 – DL area.  Find results for audio received at Channel 2 in the Channel 1 – UL area.     Figure 9-1 – Hardware Configuration for Voice Quality Test Calibration on a Nomad ME unit
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    116  9.2 Audio Server Testing Spirent Communications provides access to a centralized call server that sources downlink data and records uplink data in the audio server testing configuration.  The mobile handset will source speech while the audio server will record and score the uplink speech sample for ten seconds.  In the next ten second cycle, the audio server will source speech while the mobile handset records and scores the downlink speech sample.  To perform Audio Server testing:   Connect the test handset to any active channel on the Spirent ME hardware unit via audio cable or Bluetooth.  On the Nomad Test Setup screen, configure that channel as a Mobile task with the following settings: o Session → Uplink Device:  Audio Server o Channel Settings → Audio interface for this channel:  Analog Interface(ME), Narrowband (ME HD), High Definition (ME HD)  or Bluetooth Interface as desired.  Note that a maximum of two Bluetooth devices is permitted for Voice Quality testing at one time.  If the test mobile is connected via analog audio cable, volume settings may be optimized using Auto-Level Assist.  Click the Start Logging Session button and proceed through the Start Logging Session Wizard.    At the conclusion of the Start Logging Session Wizard, Nomad will automatically place test calls for devices which are connected via Bluetooth and which have been configured for auto-dial on the Settings → Voice Quality tab.  All other calls must be manually placed to the Audio Server.  The test sequence starts with the mobile handset sourcing data to the Audio Server.  In the second half of the cycle, the Audio Server will source data to the mobile handset.  Downlink data collected at the handset is displayed in the Voice Quality Task Status Window during the following cycle.   Figure 9-2 - Voice Quality Task Status Window   During testing, adjust the downlink volume using the Settings → Levels dialog such that the Insertion Gain (PESQ) or Attenuation (POLQA) falls as close to 0 as possible.  The Input slider should sit within the Normal Operating Range values displayed.  End the test when desired.  Test results can be found on the Voice Quality Summary tab of the output report.
Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    117  9.3 IMS Client Testing Nomad provides the option for utilizing a local IMS Client as the Uplink Device.  The mobile handset will source speech while the IMS Client will record the uplink speech sample for ten seconds, acting in the traditional role of the voice server.  In the next ten second cycle, the IMS Client will source speech while the mobile handset records the downlink speech sample.  The uplink and downlink speech sample recordings are scored by Nomad.  To perform testing utilizing the local IMS Client as Uplink Device:   Connect the test handset to any active channel on the Spirent ME hardware unit via audio cable or Bluetooth.  On the Nomad Test Setup screen, configure that channel as a Mobile task with the following settings: o Session → Uplink Device:  IMS Client o Channel Settings → Audio interface for this channel:  Analog Interface (ME), Narrowband (ME HD), High Definition (ME HD) or Bluetooth Interface as desired.  Note that a maximum of two Bluetooth devices is permitted for Voice Quality testing at one time.  Click the Start Logging Session button and proceed through the Start Logging Session Wizard.    The test sequence starts with the mobile handset sourcing data to the IMS Client.  In the second half of the cycle, the IMS Client will source data to the mobile handset.  Downlink data collected at the handset is displayed in the Voice Quality Task Status Window during the following cycle.   During testing, adjust the downlink volume using the Settings → Levels dialog such that the Insertion Gain (PESQ) or Attenuation (POLQA) falls as close to 0 as possible.  The Input slider should sit within the Normal Operating Range values displayed.  End the test when desired.  Test results can be found on the Voice Quality Summary tab of the output report. 9.4 Mobile-to-Mobile Testing Nomad provides the option for Mobile-to-Mobile testing.  One handset will source speech while the other will record speech for ten seconds.  The process will alternate in the next ten second cycle.  Both the sourcing and recording are handled by a single Nomad installation.  To perform Mobile-to-Mobile testing:   Connect one handset to Channel 1 of the Spirent ME hardware unit via audio cable.  Connect the second handset to Channel 2.
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    118   Figure 9-3 - Mobile-to-Mobile Hardware Configuration   On the Nomad Test Setup screen, configure Channel 1 as a Mobile task with the following settings: o Session → Uplink Device:  Base task on channel 2 o Channel Settings → Audio interface for this channel: Analog Interface(ME), Narrowband (ME HD) or High Definition (ME HD)  On the Nomad Test Setup screen, configure Channel 2 as a Base task with the following settings: o Session → Downlink Device:  Mobile task on channel 1 o Channel Settings → Audio interface for this channel:   Analog Interface(ME), Narrowband (ME HD) or High Definition (ME HD)  Confirm that Channel 1 and Channel 2 use the same Scoring Model.  Both channels must be set for either PESQ or POLQA in the Session dialog.  Place a call from Handset 1 to Handset 2.  The incoming call must be manually answered.  Start a new test.  The test sequence starts with Handset 1 sourcing data to Handset 2.  Data collected at Handset 2 will be replayed in the Channel 2 area of the Nomad interface.  In the next cycle, Handset 2 will source data to Handset 1.  Data collected at Handset 1 will be replayed in the Channel 1 area of the Nomad interface.  This pattern continues throughout the duration of the test.  During testing, adjust volume using the Settings → Levels dialog such that the Ins Gain (PESQ) or Attenuation (POLQA) reading for each channel falls as close to 0 as possible.  The Input and Output sliders should sit within the Normal Operating Range values displayed.  End the test when desired.
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    119   Figure 9-4 - Mobile-to-Mobile Testing  Assuming all four channels are available for testing, a second Mobile-to-Mobile test may be performed simultaneously on Channel 3 and Channel 4.  Configure this test following the same instructions as above.  In the Nomad output report generated with Mobile-to-Mobile test data, find results for audio received at Channel 1 in the Channel 1 – DL area.  Find results for audio received at Channel 2 in the Channel 1 – UL area.
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    120  9.5 Remote Unit Testing Nomad provides the option for Remote Unit testing.  In this configuration, a handset connected to the local Spirent ME hardware unit sources audio to and receives audio from a handset connected to a second hardware unit.  In many cases, the second hardware unit is located remotely (i.e. in another office or city).  One handset will source speech while the other will record speech for ten seconds.  The process will alternate in the next ten second cycle.  To perform Remote Unit hardware testing:   Connect one handset to Channel 1(ME), NB1 (ME HD) or HD1 (ME HD) on the local Spirent ME or HD ME hardware unit.    Connect the second to Channel 1(ME), NB1 (ME HD) or HD1 (ME HD) on the second Spirent ME or HD ME hardware unit.    If the second hardware unit is located remotely, ask a colleague for assistance in setting up that unit.   Figure 9-5 - Remote Unit Hardware Configuration for a Spirent ME unit   In the local Nomad software, configure Channel 1 as a Mobile task with the following setting: o Session → Uplink Device:  Base task on remote unit  In the remotely located Nomad software, configure Channel 1 as a Base task with the following setting: o Session → Downlink Device:  Mobile task on remote unit  Confirm that both test channels use the same Scoring Model.  Both channels must be set for either PESQ or POLQA in the Session dialog.  If testing is performed using Spirent HD ME hardware, make sure the proper Audio interface (High definition or Narrowband) is selected.  Place a call from Handset 1 to Handset 2.  The incoming call must be manually answered.  Start a new test.  The test sequence starts with the local handset sourcing data to the remote handset.  Data collected at the remote handset will be displayed in Channel 1 of the remote Nomad installation.  In the next cycle, the remote handset will source data to the local handset.  Data collected at the local handset will be displayed in Channel 1 of the local Nomad installation.  During testing, adjust volume using the Settings → Levels dialog such that the Ins Gain (PESQ) or Attenuation (POLQA) reading for each channel falls as close to 0 as possible.  The Input and Output sliders should sit within the Normal Operating Range values displayed.
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    121   End the test when desired.  Assuming all four channels are available for testing, up to four Remote Unit test calls may be placed simultaneously.  Spirent Communications recommends aligning calling handsets on the local unit with the receiving handsets on the remote unit.  For example, Handset 1 on the local unit should call Handset 1 on the remote unit; Handset 2 on the local unit should call Handset 2 on the remote unit, etc.  To merge the local and remote data into a single file for report generation:   Obtain the remotely collected log file via email or other file transfer method.  Because Nomad will have access to only the locally collected data (not the data collected remotely), the locally collected file will appear on the Data → Incomplete tab after collection is stopped.  Use the Import Logging Session button to find and open the remotely collected log file.  On the Data → Incomplete tab, select both the locally collected file and the remotely collected file.  Click the Merge Selected button to merge the two files into one file that will appear on the Data → Complete tab and which can be used to generate a formatted output report.  In the formatted output report generated with Remote Unit test data, find results for locally received audio in the Channel 1 – DL area.  Find results for remotely received audio in the Channel 1 – UL area.
Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    122  9.6 Landline Module Testing The Nomad Landline Module provides an additional option for voice quality testing without needing the Audio Server to handle uplink data collection and downlink audio sourcing.  In the Landline Module configuration, the mobile end consists of the standard test handset and Spirent ME unit connected to a laptop which handles downlink data collection and uplink audio sourcing.  The landline end consists of the Nomad Landline Module hardware with analog phone connections as the communications links and a computer for handling uplink data collection and downlink audio sourcing.   Figure 9-6 - System Configuration for Landline Module Testing  The Nomad setup and configuration for both the mobile and landline end are identical to the Remote Unit testing configuration described in Section 9.5.  The only difference is that each test call is made from the handset at the mobile end to the corresponding analog phone line on the Landline Module.  For the best results, Spirent Communications recommends dialing Line 1 using the handset on Channel 1, Line 2 with Channel 2, etc.  Assuming all four channels are available for testing, up to four Landline Module test calls may be placed simultaneously.
Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    123  9.7 Base Station Simulator Testing The Rohde & Schwarz CMU200 and the Agilent 8960 are two commercial base station simulators commonly used to simulate radio conditions in a test lab.  An optional Nomad upgrade provides the cables required to interface to these pieces of equipment for degraded channel, noise cancellation and other test scenarios.  This interface allows the test handset to perform both sourcing and recording functions, enabling handset testing without a landline voice server.  Please contact your Spirent Communications representative for additional information on this option.  To configure Nomad to work with a base station simulator:   Load the desired test conditions into the simulator.  Please see the manufacturer’s documentation for simulator configuration details.  Connect the test handset to Channel 1 on the Spirent ME hardware unit or NB1/HD1 on the Spirent HD ME Unit.  Connect the audio cable provided by Spirent Communications from the base station simulator to Channel 2 on the Spirent ME hardware unit or NB2/HD2 on the Spirent HD ME Unit.     Figure 9-7 - Base Station Simulator Hardware Configuration   On the Nomad Test Setup screen, configure Channel 1 as a Mobile task with the following setting: o Session → Uplink Device:  Base task on channel 2  On the Nomad Test Setup screen, configure Channel 2 as a Base task with the following settings: o Session → Downlink Device:  Mobile task on channel 1 o Channel Settings → Input:  205 o Channel Settings → Output:  195  Confirm that both test channels use the same Scoring Model.  Both channels must be set for either PESQ or POLQA in the Session dialog.
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    124   If testing is performed using Spirent HD ME hardware, make sure the proper Audio interface (High definition or Narrowband) is selected.  Establish a call between the handset and the base station simulator.  Start a new test.  The test sequence starts with the handset sourcing data to the base station simulator.  Data collected at the simulator will be replayed in the Channel 2 area of the Nomad interface.  In the next cycle, the simulator will source data to the handset.  Data collected by the handset will be replayed in the Channel 1 area of the Nomad interface.  This pattern continues throughout the duration of the test.   Figure 9-8 - Base Station Simulator Testing   During testing, adjust Channel 1 volume using the Settings → Levels dialog.  Adjust the Channel 1 Input level for optimal MOS performance of the Channel 1 (handset) waveform.  Adjust the Channel 1 Output level for optimal MOS performance of the Channel 2 (simulator) waveform.  End the test when desired.  In the formatted Nomad output report generated with base station simulator test data, find results for audio received at the handset in the Channel 1 – DL area.  Find results for audio received at the simulator in the Channel 1 – UL area.
Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    125  9.8 Head and Torso Simulator A HATS system may be used in conjunction with the test mobile and base station simulator in Nomad testing to realistically simulate the effect of an adult head and torso on voice quality.  The HATS system may be configured with the base station simulator in two different configurations.  In each configuration, MOS results at the handset will be displayed in the Nomad Channel 1 area during testing.  Results at the base station simulator will be displayed in the Channel 2 area.  The following sections describe the hardware configuration and volume settings for the HATS configurations.  All other test procedures follow the steps described for the base station simulator in Section 9.7.  Note:  The Input and Output settings described in this section represent a general test case that has been performed in the Spirent lab.  If you require settings corresponding to specific test cases, please contact your Spirent representative.  9.8.1 HATS Three Channel Configuration The Spirent ME hardware shall be configured as shown below for the HATS three channel configuration:   Figure 9-9 - HATS Three Channel Configuration  On the Nomad Test Setup tab, configure the channels as follows:   Channel 1 – Connects to Ear MIC – “Downlink” o Task Type:  Mobile o Session → Uplink Device:  Base task on channel 2 o Channel Settings → Audio interface for this channel:  Analog Interface(ME), Narrowband (ME HD) or High Definition (ME HD) o Channel Settings → Input: 210
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    126  o Channel Settings → Output: 100  Channel 2 – Connects to base station simulator – “Uplink” o Task Type:  Base o Session → Downlink Device:  Mobile task on channel 1 o Channel Settings → Audio interface for this channel:  Analog Interface(ME), Narrowband (ME HD) or High Definition (ME HD) o Channel Settings → Input: 180 o Channel Settings → Output: 200  Channel 3 – Connects to AMP and HATS mouth speaker – “mouth” o Task Type:  Mobile o Session → Uplink Device:  Base task on channel 4 o Channel Settings → Audio interface for this channel: Analog Interface(ME), Narrowband (ME HD) or High Definition (ME HD) o Channel Settings → Input: 100 o Channel Settings → Output: 200  Channel 4 – No hardware connections o Task Type:  Base o Session → Downlink Device:  Mobile task on channel 3 o Channel Settings → Audio interface for this channel: Analog Interface(ME), Narrowband (ME HD) or High Definition (ME HD) o Channel Settings → Input: 100 o Channel Settings → Output: 100  Finally, make these additional system adjustments:   Handset Volume: o One level below max  Yamaha Amp Setting: o 20 dB
Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    127  9.8.2 HATS Two Channel Configuration The Spirent ME hardware shall be configured as shown below for the HATS two channel configuration:   Figure 9-10 - HATS Two Channel Configuration  On the Nomad Test Setup tab, configure the channels as follows:   Channel 1 – Connects to Ear MIC – “Downlink” o Task Type:  Mobile o Session → Uplink Device:  Base task on channel 2 o Channel Settings → Audio interface for this channel:  Analog Interface(ME), Narrowband (ME HD) or High Definition (ME HD) o Channel Settings → Input: 210 o Channel Settings → Output: 200  Channel 2 – Connects to base station simulator – “Uplink” o Task Type:  Base o Session → Downlink Device:  Mobile task on channel 1 o Channel Settings → Audio interface for this channel: Analog Interface(ME), Narrowband (ME HD) or High Definition (ME HD) o Channel Settings → Input: 180 o Channel Settings → Output: 200  Finally, make these additional system adjustments:   Handset Volume: o One level below max  Yamaha Amp Setting: o 20 dB
Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    128  9.9 Wideband Testing Nomad contains a wideband speech stimulus designed to exercise various wideband AMR codecs enabled on a wideband AMR device under test.  Wideband testing works as follows:   During the audio quality testing process, Nomad injects a wideband speech stimulus into the communication test path.  The degraded speech at the receiving end is compared to the original wideband reference speech.  The MOS calculation is based on this comparison.  Wideband testing is possible in these test configurations:   Mobile-to-Mobile testing  Base Station Simulator testing  Wideband testing is not possible in the Audio Server test configuration due to the narrowband limitation on the PSTN connection to the Audio Server.  In general, mobile-to-landline testing is unsupported for wideband due to this limitation.  To perform wideband testing:   Confirm that all test devices (including test mobiles and base station simulator) and the test network support the wideband codec and have been configured for wideband.  Some devices must be manually set to the wideband codec.  (Nomad ME) On the Nomad Settings → Voice Quality tab, set the Hardware Sample Rate to Wideband (16 kHz).  (Nomad HD ME) On the Nomad Settings → Audio interface tab, select the High Definition option.  Configure the Mobile-to-Mobile or Base Station Simulator test as necessary: o See Section 9.4 to configure a Mobile-to-Mobile test. o See Section 9.7 to configure a Base Station Simulator test.  Note that the simulator should initially be configured with the channels simulating ideal yet realistic conditions (i.e. no degradation introduced).  A familiar test device with the appropriate adapter should be selected for confirming the initial setup.  Set the volume of each test handset to one level below the maximum.  Start a new test.  Adjust the Input Levels and Output Levels using the Settings → Levels controls: o For a Mobile-to-Mobile test:  Keep the Output level at a fixed value for both handsets.  Spirent recommends an Output level of 140.  Adjust the Input level for both handsets to obtain an Ins Gain reading between -7 and 0 dB, or an Attenuation reading between 0 and 7 dB. o For a Base Station Simulator test:  For Channel 1:  Adjust the Input level to obtain an Ins Gain reading between -7 and 0 dB, or an Attenuation reading between 0 and 7 dB.  Adjust the Output level to obtain an Ins Gain reading between -12 and 0 dB, or an Attenuation reading between 0 and 12 dB.  For Channel 2:  Set the Input level to 205.
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    129   Set the Output level to 195.  End the test when desired. 9.10 Multi-RAB Testing The objective of Multi-RAB testing with Nomad is to compare the voice quality or call performance of a mobile engaged in data services to a device not transferring data.  An Email Campaign may be launched during a Voice Quality or Call Performance task to test Multi-RAB performance.  An Email Campaign tests whether e-mail sent to the phone during a call disrupts performance as compared to phones not receiving e-mail.  To configure a Multi-RAB test in Nomad:   If custom content is desired for the e-mail to be sent to the phone during testing, navigate to Settings → Email Campaigns.  On this tab, check Override default email content and enter the Custom email body content.  If this option is left unchecked, the default content of the system-generated message reads:  “This is an auto-generated e-mail from Spirent Communications, Inc.”   Figure 9-11 - Email Campaign Settings   On the Test Setup tab, configure two Voice Quality or Call Performance tasks as normal.    Leave one mobile as the “control” with no data traffic.  Use the Email Campaign dialog to configure one mobile to receive periodic e-mails during testing: o Select the Initiate email campaign when logging starts option. o Enter an e-mail address accessible to the test mobile in the Recipient email address (To:) field. o Enter the # of emails to send to the mobile. o Enter the Interval between emails in seconds.
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    130  o If desired, Generate Test Email to be sent to the mobile device.  Note:  In the event that Nomad becomes inaccessible during an Email Campaign (for example, due to a PC crash), it is possible to stop e-mail messages from being sent to the handset(s).  Simply reply to any message generated by the Email Campaign to stop further messages from being sent to the phone.   Figure 9-12 - Email Campaign Dialog   Start a new test.  During testing and when analyzing results in the output report, watch for performance differences between the multi-RAB and the control device.  Remember that the goal of this test is to compare the voice quality or call performance of a mobile engaged in data services to a device not transferring data.
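To make the Email Campaign timing concrete, the sketch below (not part of Nomad) lists when each message would be sent relative to the start of logging, given the # of emails and Interval values entered in the dialog.  It assumes the first message goes out as soon as logging starts; the function name is this example’s own.

```python
# Illustrative only: scheduled send offsets for an Email Campaign, given
# the "# of emails to send" and "Interval between emails" dialog values.
# Assumes the first message is sent as soon as logging starts.

def email_send_times(num_emails: int, interval_s: float) -> list[float]:
    """Offsets (seconds from logging start) at which emails are sent."""
    return [i * interval_s for i in range(num_emails)]

# A campaign of 10 emails at a 120-second interval spans the first 18
# minutes of the test (last message at t = 1080 s).
print(email_send_times(10, 120))
```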
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    131  9.11 Voice Delay Testing Nomad allows the precise measurement of speech delay between two mobile devices connected to the same Spirent ME unit.  The measurement includes delay introduced by both handsets and the time it takes to traverse the network.    High levels of delay (generally over 250 milliseconds round-trip) may impact typical conversation.  In the presence of high delay levels, normal conversation breaks down, as speakers are likely to interrupt and talk over each other during the call.  Additionally, delay can exacerbate annoying echo problems.  Note:  The Delay Task cannot be run with any other tasks.  All other tasks must be removed from the Test Setup screen before configuring a Delay Task.    To perform Voice Delay testing:   Connect one handset to Channel 1 (ME) or NB1 (HD ME) via audio cable.  Connect the second handset to Channel 2 (ME) or NB2 (HD ME).   Figure 9-13 - Voice Delay Hardware Configuration   On the Nomad Test Setup screen, configure Channel 1 as a Voice Delay task with the following settings: o Destination Channel:  2 o Cycles:  The desired number of cycles to define the length of the test.  The total sample count for a test will be Cycles multiplied by Samples Per Cycle.
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    132   Figure 9-14 - Voice Delay Task Session Dialog   Spirent recommends leaving the default values for the remaining items in the Session dialog.  Maintaining the default parameters ensures consistency across tests for benchmarking purposes.  Definitions of each parameter are provided here for informational purposes: o Alternate channels after each cycle:  With this option checked, the Nomad channel that is sourcing audio will alternate between each cycle. o Samples Per Cycle:  The number of measurements to take for each Voice Delay Task cycle. o Sample Spacing:  The time in milliseconds between each sample.  For example, a test configured for 10 samples per cycle with sample spacing set to 1000 will result in 10 samples collected with 1 second between each sample. o Delay Subtraction Constant:  This value is used to modify the delay results by a fixed number.  This would be utilized if testing were being done on a particular test system where it is known that a process on the test server is introducing a known artificial delay that should not be taken into account in the results. o Enable side tone rejection: In some test configurations, one phone may be used with a special cable to function as both the origination and destination channel by sourcing audio to a server that echoes the audio back.  In these cases, some phones echo back audio with a minimal delay directly to the speaker before sending it to the server.  This setting, in combination with the Side Tone Duration, will ensure that this side tone will not be measured as the delay.  This setting may not be necessary on all phones with side tone present.  Side tone rejection is only enabled on the HD interface. o Side Tone Duration: Duration in ms after start of audio sourcing to ignore received audio as side tone. o Enable audio priming: With this option enabled, Nomad will source 5 seconds of audio to the channel prior to starting each cycle of delay measurements.  This is required in some cases to make sure that the phone is ready to take delay measurements.   Place a call from Handset 1 to Handset 2.  The incoming call must be manually answered.  Start a new test.
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    133   The test sequence starts with Handset 1 sourcing audio to Handset 2.  Assuming Alternate channels after each cycle has been selected, in the next cycle Handset 2 will source audio to Handset 1.  The Average Delay and Median Delay will be plotted for each direction on the Delay Times (ms) / Cycle chart.  Detailed statistics for each test cycle are available in the Cycle History area.   Figure 9-15 - Voice Delay Testing  While monitoring the data during testing, be mindful of the following:   Confirm that each individual cycle has completed and that Nomad has reported data.  Confirm that the Delay Times chart updates for each cycle.  The test will stop after the configured number of Cycles has been completed.  Alternatively, use the Stop button to end the test at any time as desired.  Delay test results can be found on the Delay Performance Summary tab of the output report.  Note:  Delay measurements greater than 250 milliseconds round-trip are considered “high” and are generally detectable during standard conversation.
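The relationships among the Delay Task parameters described above can be summarized in a few lines of Python.  The sketch below is illustrative only (the function names are this example’s own, not Nomad utilities): it computes the total sample count from Cycles and Samples Per Cycle, applies a Delay Subtraction Constant to a raw reading, and summarizes one cycle’s samples against the 250 millisecond guideline.

```python
from statistics import mean, median

HIGH_DELAY_MS = 250  # round-trip delay generally noticeable in conversation

def total_samples(cycles: int, samples_per_cycle: int) -> int:
    """Total number of delay measurements in a test."""
    return cycles * samples_per_cycle

def adjusted_delay_ms(measured_ms: float, delay_subtraction_constant_ms: float = 0.0) -> float:
    """Reported delay after removing a known artificial test-system delay."""
    return measured_ms - delay_subtraction_constant_ms

def summarize_cycle(samples_ms: list[float]) -> dict:
    """Average and Median delay for one cycle, plus any samples above 250 ms."""
    return {
        "average_ms": mean(samples_ms),
        "median_ms": median(samples_ms),
        "high_samples": [s for s in samples_ms if s > HIGH_DELAY_MS],
    }

# 5 cycles of 10 samples each yield 50 measurements.
print(total_samples(5, 10))
# A 320 ms raw reading on a system with a known 40 ms server-side delay is
# reported as 280 ms.
print(adjusted_delay_ms(320, 40))
# One cycle's samples: average 239 ms, median 238 ms, one sample above 250 ms.
print(summarize_cycle([232, 241, 238, 255, 229]))
```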
 Nomad User’s Manual  Chapter 9 – Voice Quality Configuration Options Copyright © Spirent Communications, Inc. 2013    134   Figure 9-16 - Delay Performance Summary Report
 Nomad User’s Manual  Chapter 10 – Call Performance Configuration Options Copyright © Spirent Communications, Inc. 2013    135  10 Call Performance Configuration Options Nomad provides various configuration options for call performance testing.  These options include:   Mobile Originated Testing  Mobile Terminated Testing  Mobile-to-Mobile Testing  This section contains detailed instructions for each call performance configuration option.  10.1 Mobile Originated Testing In a Mobile Originated test, the mobile test device makes calls to the Call Server for call control testing.  To perform Mobile Originated call performance testing:   Set the volume of the test handset to the maximum volume for optimal Audio Verification results.  Determine which channel the test will be conducted on.  All call performance testing operates via Bluetooth connection.  On the Nomad Test Setup screen, configure that channel as a Mobile Originated task with the following settings in the Call Campaign dialog: o Dial/Answer Method: This field will only be visible to customers who have purchased the ability to control devices via a tethered connection.  This drop-down allows those customers to choose between controlling via the Bluetooth connection, or the tethered connection. o Phone Number Settings:  Enter the Number to call as provided by Spirent and the Number calling from. o Call Initiation Mode:    Synchronous Testing – Calls start simultaneously on all devices regardless of call outcomes.  For example, if one device drops a call, it will remain idle until the next time all mobiles are scheduled to start a new call.   Asynchronous Testing – Each device follows its own call sequence without regard to other device progress.  In this mode, if one device drops a call, it will wait for the specified time and then start a new call, even as the other devices continue their first call. o Access Timeout – A call attempt that has not connected within this amount of time will be classified as an Access Timeout event. o Duration – The length of each call in the test sequence.  In a Synchronous campaign, this refers to the total attempt duration including access time and connected time.  In an Asynchronous campaign, this refers to the connected time only. o Wait Time – The amount of idle time between the end of one call in the sequence and the start of the next call. o Attempts – The number of calls to attempt in this task sequence.
 Nomad User’s Manual  Chapter 10 – Call Performance Configuration Options Copyright © Spirent Communications, Inc. 2013    136   Figure 10-1 - Call Campaign Dialog for a Mobile Originated Task   Establish the Bluetooth connection for the test device.  Click the Start Logging Session button and proceed through the Start Logging Session Wizard.  Calls will automatically be placed from the test handset to the Call Server.  During testing, the call status is displayed in the Call Performance Task Status Window.  Session statistics are displayed in the Call Performance Statistics Window.  See Section 7.1 for details about the display.   Figure 10-2 - Mobile Originated Call Performance Testing
 Nomad User’s Manual  Chapter 10 – Call Performance Configuration Options Copyright © Spirent Communications, Inc. 2013    137   The test will stop after the configured number of Attempts.  Alternatively, use the Stop button to end the test at any time as desired.  Test results can be found on the Call Performance Summary, Call Initiation, Call Retention, Audio Verification and Device Performance tabs of the output report.
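The Duration and Wait Time semantics described in the Call Campaign settings above determine roughly how long a campaign runs.  The sketch below is an illustrative estimate only, not a Nomad feature: in a Synchronous campaign Duration covers the whole attempt (access plus connected time), while in an Asynchronous campaign it covers connected time only, so a typical access time must be added per attempt.

```python
# Rough, illustrative campaign-length estimate based on the Duration and
# Wait Time semantics described above. Not part of Nomad.

def campaign_time_s(attempts: int, duration_s: float, wait_s: float,
                    synchronous: bool, typical_access_s: float = 0.0) -> float:
    # Synchronous: Duration already includes access time.
    # Asynchronous: Duration is connected time only, so add access time.
    per_attempt = duration_s if synchronous else duration_s + typical_access_s
    return attempts * (per_attempt + wait_s)

# 50 synchronous attempts of 60 s with a 15 s wait take about 62.5 minutes.
print(campaign_time_s(50, 60, 15, synchronous=True) / 60)
```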
 Nomad User’s Manual  Chapter 10 – Call Performance Configuration Options Copyright © Spirent Communications, Inc. 2013    138  10.2 Mobile Terminated Testing In a Mobile Terminated test, the Call Server makes calls to the mobile test device for call control testing.  To perform Mobile Terminated call performance testing:   Set the volume of the test handset to the maximum volume for optimal Audio Verification results.  Determine which channel the test will be conducted on.  All call performance testing operates via Bluetooth connection.  On the Nomad Test Setup screen, configure that channel as a Mobile Terminated task with the following settings: o Dial/Answer Method: This field will only be visible to customers who have purchased the ability to control devices via a tethered connection.  This drop-down allows those customers to choose between controlling via the Bluetooth connection, or the tethered connection. o Number to dial:  The mobile phone number to be dialed by the Call Server. o Call Initiation Mode:    Synchronous Testing – Calls start simultaneously on all devices regardless of call outcomes.  For example, if one device drops a call, it will remain idle until the next time all mobiles are scheduled to start a new call.    Note:  Synchronous Testing may not be maintained for Mobile Terminated call campaigns.  Asynchronous Testing – Each device follows its own call sequence without regard to other device progress.  In this mode, if one device drops a call, it will wait for the specified time and then start a new call, even as the other devices continue their first call. o Turn off answering – Turns off automatic answering of incoming calls, to facilitate manual intervention. o Access Timeout – A call attempt that has not connected within this amount of time will be classified as an Access Timeout event. o Duration – The length of each call in the test sequence.  In a Synchronous campaign, this refers to the total attempt duration including access time and connected time.  In an Asynchronous campaign, this refers to the connected time only. o Wait Time – The amount of idle time between the end of one call in the sequence and the start of the next call. o Attempts – The number of calls to attempt in this task sequence.
 Nomad User’s Manual  Chapter 10 – Call Performance Configuration Options Copyright © Spirent Communications, Inc. 2013    139   Figure 10-3 - Call Campaign Dialog for a Mobile Terminated Task  Establish the Bluetooth connection for the test device.  Click the Start Logging Session button and proceed through the Start Logging Session Wizard.  Calls will automatically be placed from the Call Server to the test handset.  During testing, the call status is displayed in the Call Performance Task Status Window.  Session statistics are displayed in the Call Performance Statistics Window.  See Section 7.1 for details about the display.  Note:  Only those fields for which data is available at the mobile end will be populated in the Call Performance Statistics window during Mobile Terminated testing.  The remaining fields will display as PENDING during the test.  The statistics for these fields will be compiled in the formatted output report generated at the conclusion of testing.  The affected fields are: o Average Access Time o Failed Attempts o No Service o Access Timeout o Dropped (Count)
 Nomad User’s Manual  Chapter 10 – Call Performance Configuration Options Copyright © Spirent Communications, Inc. 2013    140   Figure 10-4 - Mobile Terminated Call Performance Testing   The test will stop after the configured number of Attempts.  Alternatively, use the Stop button to end the test at any time as desired.  Note:  Losing internet connectivity during a Mobile Terminated Call Performance campaign may cause the Stop button to cease working.  In this case, wait for a call to come to the phone, answer the call and dial 9999.  This manually stops the Call Server, which would otherwise continue calling the phone until the scheduled end of the test.  Test results can be found on the Call Performance Summary, Call Initiation, Call Retention, Audio Verification and Device Performance tabs of the output report.   Spirent recommends following these best practices for the best results with Mobile Terminated testing:   Confirm that the test handset is properly configured with voicemail.  This will ensure that voicemail is properly detected during testing.  The Access Timeout parameter for a Mobile Terminated call campaign must be set to a longer duration than the time it takes the phone to go to voicemail.  If Access Timeout is too short, Nomad will fail to recognize a call sent to voicemail.  Instead, such a call will be classified as an Access Timeout before voicemail picks up.  A 45-second Access Timeout is generally sufficient for North American carriers.  Nomad can take up to seven seconds to detect voicemail.  In a Synchronous campaign it is best practice to set Duration at least seven seconds longer than Access Timeout to ensure that no Voicemail events are missed.
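The voicemail-related best practices above amount to two simple checks on the campaign parameters.  The sketch below expresses them in Python for illustration; the function and its name are this example’s own, not part of Nomad.

```python
# Parameter check for the Mobile Terminated best practices above:
# Access Timeout must exceed the time it takes the handset to reach
# voicemail, and in a Synchronous campaign Duration should exceed Access
# Timeout by at least the ~7 s Nomad may need to detect voicemail.

VOICEMAIL_DETECT_S = 7

def mt_campaign_params_ok(access_timeout_s: float, duration_s: float,
                          time_to_voicemail_s: float, synchronous: bool) -> bool:
    if access_timeout_s <= time_to_voicemail_s:
        return False  # a voicemail call would be misclassified as Access Timeout
    if synchronous and duration_s < access_timeout_s + VOICEMAIL_DETECT_S:
        return False  # Voicemail events could be missed
    return True

# A 45 s Access Timeout with a 60 s Duration passes for a phone that takes
# roughly 30 s to reach voicemail.
print(mt_campaign_params_ok(45, 60, 30, synchronous=True))
```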
 Nomad User’s Manual  Chapter 10 – Call Performance Configuration Options Copyright © Spirent Communications, Inc. 2013    141  10.3 Mobile-to-Mobile Testing In a Mobile-to-Mobile test, one handset makes calls that are received by a second handset attached to a different channel on the same Spirent ME unit.  To perform Mobile-to-Mobile call performance testing:   Determine which handset will be the originating device and which handset will be the receiving device.  All call performance testing operates via Bluetooth connection.  On the Nomad Test Setup screen, configure Channel 1 as a Mobile-to-Mobile Originate task with the following Call Campaign settings: o Receiving Device:  Receiver on Channel 2. o Number to dial:  The phone number of the receiving test device. o Access Timeout:  A call attempt that has not connected within this amount of time will be classified as an Access Timeout event. o Duration:  The length of each call in the test sequence. o Wait Time:  The amount of idle time between the end of one call and the start of the next call. o Attempts:  The number of calls to attempt in this call sequence.  Note:  All Mobile-to-Mobile call performance tasks are Synchronous.  Therefore, there is no option to select the Call Initiation Mode.   Figure 10-5 - Mobile-to-Mobile Originated Call Campaign Dialog   Channel 2 will automatically be configured as a Mobile-to-Mobile Receive task.  Confirm that the Incoming number in the Call Campaign dialog matches the phone number of the Mobile-to-Mobile Originating device.  The settings here are pulled automatically from the Mobile-to-Mobile Originate task and are not editable here.  These settings will change automatically when changes are made to the Mobile-to-Mobile Originate task.
 Nomad User’s Manual  Chapter 10 – Call Performance Configuration Options Copyright © Spirent Communications, Inc. 2013    142   Figure 10-6 - Mobile-to-Mobile Terminated Call Campaign Dialog   Establish the Bluetooth connection for both test devices.  Click the Start Logging Session button and proceed through the Start Logging Session Wizard.  Calls will automatically be placed from the originating handset to the receiving handset.  During testing, the call status is displayed in the Call Performance Task Status Window for each device.  Session statistics for both devices are displayed in the Call Performance Statistics Window.  See Section 7.1 for details about the display.   Figure 10-7 - Mobile-to-Mobile Call Performance Testing
 Nomad User’s Manual  Chapter 10 – Call Performance Configuration Options Copyright © Spirent Communications, Inc. 2013    143   When testing is complete, the matching Mobile Originated and Mobile Terminated results are merged together to create a single session file.  Generate a report in the standard manner for this merged session file.  Test results can be found on the Call Performance Summary, Call Initiation, Call Retention, Audio Verification and Device Performance tabs of the output report.  In the report, the Mobile to Mobile “Calling” channel is treated as a Mobile Originated task.  The Mobile to Mobile “Receiving” channel is treated as a Mobile Terminated task.  The statistics reported for the two tasks will likely be similar, with a few key differences:   Call Initiation Statistics – Calls that failed on the MO side will not be represented on the MT side.  Specifically, initiations classified as Failed Attempts, No Service or Access Timeout will be counted on the MO channel but not on the MT channel.  Setup Time – The Setup Time is generally larger for MT calls than for MO calls.  Spirent recommends following this best practice for the best results with Mobile-to-Mobile call performance testing:   The Access Timeout parameter for a Mobile-to-Mobile call campaign must be set to a larger duration than it takes the phone to go to voicemail.  If Access Timeout is too short, Nomad will fail to recognize a call sent to voicemail.  Instead, such a call will be classified as an Access Timeout before voicemail picks up.  A 45-second Access Timeout is generally sufficient for North American carriers.
 Nomad User’s Manual  Appendix A – Glossary Copyright © Spirent Communications, Inc. 2013    144  Appendix A – Glossary

General Testing Terms
 Call Performance Task – A task designed to measure how well the device performs with regard to call initiation, call retention, Bluetooth performance, signal strength and battery life.
 Configured Task – A Configured Task saves all of the settings for the currently selected task.
 Session Profile – A Session Profile defines the tasks and settings for all available test channels.
 Task – The type of activity to be performed on an available test channel.
 Test – The arrangement of the available Nomad channels based on the task types and settings available for each.
 Voice Quality Task – A task designed to measure how speech is perceived by the end user of the test device.

Voice Quality Testing Terms
 Base Task – The test device on a Base channel serves as the uplink device in the mobile communication path.  This option should be selected for:
o The uplink end of a Landline Module test.
o One of the test handsets in a bi-directional Mobile-to-Mobile test scenario (the other handset will be set as a Mobile).
o One of the test handsets in a bi-directional Remote Unit test scenario (the other handset will be set as a Mobile).
 Nomad User’s Manual  Appendix A – Glossary Copyright © Spirent Communications, Inc. 2013    145
 Mobile Task – The test device on a Mobile channel serves as the downlink device in the mobile communication path.  This option is used for:
o Standard bi-directional testing using the Audio Server.
o A mobile acting as the downlink device in a test using the Landline Module.
o One of the test handsets in a bi-directional Mobile-to-Mobile test scenario (the other handset will be set as a Base).
o One of the test handsets in a bi-directional Remote Unit test scenario (the other handset will be set as a Base).
 Record Only Task – Select this option to perform downlink testing only.  With this setting, Nomad will record speech every ten seconds but will not source speech.  This setting can be used to shorten the test cycle time in the event that only downlink data is required.
 Source Only Task – Select this option to perform uplink testing only.  With this setting, Nomad will send speech every ten seconds but will not record anything.  This setting can be used to shorten the test cycle time in the event that only uplink data is required.

Call Performance Testing Terms
 Asynchronous Testing – Each device follows its own call sequence without regard to other device progress.  In this mode, if one device drops a call, it will wait for the specified time and then start a new call, even as other devices continue their first call.
 Idle Task – The test device makes no calls.  This type of task only reports changes in signal strength and battery level.
 Mobile Originated Task – Calls are made from the mobile test device to the Call Server.
 Mobile Terminated Task – Calls are made from the Call Server to the mobile test device.
 Synchronous Testing – Calls start simultaneously on all devices regardless of individual call outcomes.  For example, if one device drops a call, it will remain idle until the next time all mobiles are scheduled to start a new call.
 Nomad User’s Manual  Appendix B – Call Performance Events Copyright © Spirent Communications, Inc. 2013    146  Appendix B – Call Performance Events

Call Outcome Events
 No Service – There was no service available when the call was attempted.
 Failed Attempt – A Mobile Originated Call was placed and an outgoing call was established followed by the phone’s return to the call placement state.
 Voicemail – A Mobile Terminated call was placed and the nonstandard voicemail response was received from the mobile.
 Busy – A Mobile Terminated call was placed and the nonstandard busy response was received from the mobile.  A busy response typically results when the network is unable to place the call.
 Fast Busy – A Mobile Terminated call was placed and the nonstandard fast busy response was received from the mobile.  Both the fast busy response and the no capacity response typically indicate an issue with the communication path to the server.
 No Capacity – A Mobile Terminated call was placed and the nonstandard no capacity response was received from the mobile.  Both the fast busy response and the no capacity response typically indicate an issue with the communication path to the server.
 Access Timeout – Mobile Originated Call: a call was placed but no state change occurred before the Access Timeout time.  Mobile Terminated Call: the connection did not take place before the Access Timeout time.
 Drop – The call was ended before the expected duration had elapsed.
 Physical Link Error – The Bluetooth connection between the test mobile and the Nomad hardware unit was not active at the time a call was supposed to occur.
 Logical Device Error – The test mobile failed to respond properly to a command issued by Nomad via the Bluetooth connection.
 BC Error (BlueCore Error) – The Bluetooth module was not in a nominal state at the time a call was supposed to occur.
 User Terminated – The Stop button on the handset was pressed during a call.
 Call Succeeded – Any call that lasts the expected duration is considered a successful call.
 Nomad User’s Manual  Appendix B – Call Performance Events Copyright © Spirent Communications, Inc. 2013    147
Device State Events
 Device State Value – The device may change to any of the following:
o Initializing
o Ready
o Discovering
o Connecting
o Connected
o Outgoing Call Established
o Incoming Call Established
o Active Call
 Signal Strength – The phone signal strength corresponding to the handset bar display (0 – 5)
 Battery Life – The phone battery life corresponding to the handset bar display (0 – 5)
 RSSI – The RSSI reported by the handset (-113 to -51 dBm); this feature is device-dependent and only available on phones that support RSSI reporting
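For readers post-processing logged device-state data, the sketch below shows one hypothetical way to represent a device-state sample and validate it against the documented value ranges; the class and field names are this example’s own, not a Nomad data format.

```python
# Illustrative only: a record of the device-state quantities listed above,
# with range checks taken from the documented ranges (signal strength and
# battery life as 0-5 bars, RSSI from -113 to -51 dBm).

from dataclasses import dataclass
from typing import Optional

DEVICE_STATES = {
    "Initializing", "Ready", "Discovering", "Connecting", "Connected",
    "Outgoing Call Established", "Incoming Call Established", "Active Call",
}

@dataclass
class DeviceStateSample:
    state: str
    signal_bars: int              # 0-5
    battery_bars: int             # 0-5
    rssi_dbm: Optional[int] = None  # -113..-51 dBm, None if not reported

    def is_valid(self) -> bool:
        return (self.state in DEVICE_STATES
                and 0 <= self.signal_bars <= 5
                and 0 <= self.battery_bars <= 5
                and (self.rssi_dbm is None or -113 <= self.rssi_dbm <= -51))

print(DeviceStateSample("Active Call", 4, 5, -78).is_valid())  # True
```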
 Nomad User’s Manual  Appendix C – Confidence Interval Calculation Copyright © Spirent Communications, Inc. 2013    148  Appendix C – Confidence Interval Calculation The Nomad Call Performance output reports employ a 90% Confidence Interval for call initiation failure and dropped call events.  The Confidence Interval allows us to report, with 90% confidence, that the event rate for a given device will fall within the calculated range above and below the measured value.  The Confidence Interval is calculated as a function of the estimated standard deviation of the sample proportion (% dropped calls or % call initiation failures) and the standard normal distribution function.  For a measured event proportion $\hat{p}$ over $n$ call attempts, the estimated standard deviation of the sample proportion is $\sigma_{\hat{p}} = \sqrt{\hat{p}(1-\hat{p})/n}$; therefore, the 90% Confidence Interval is $\hat{p} \pm z_{0.95}\,\sigma_{\hat{p}} = \hat{p} \pm 1.645\sqrt{\hat{p}(1-\hat{p})/n}$.
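The following sketch implements the calculation just described under the standard normal-approximation assumption: the 90% interval is the measured event rate plus or minus 1.645 estimated standard deviations of the sample proportion.  It mirrors the description above but is not Nomad’s own code.

```python
from math import sqrt

def confidence_interval_90(events: int, attempts: int) -> tuple[float, float]:
    """90% confidence interval for an event rate such as % dropped calls."""
    p = events / attempts
    sigma = sqrt(p * (1 - p) / attempts)   # std. dev. of the sample proportion
    margin = 1.645 * sigma                 # z value for a two-sided 90% interval
    return (p - margin, p + margin)

# Example: 4 dropped calls in 200 attempts -> 2% plus or minus roughly 1.6%.
print(confidence_interval_90(4, 200))
```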
 Nomad User’s Manual  Appendix D – PESQ Tools GUI Case Study Copyright © Spirent Communications, Inc. 2013    149  Appendix D – PESQ Tools GUI Case Study The PESQ Tools GUI is available for Nomad users wishing to perform advanced analysis of any waveform captured during Voice Quality testing and scored using PESQ.  Basic operation of the PESQ Tools GUI is described in Section 8.7.    The PESQ Tools GUI can assist in the identification of issues that contribute to low MOS such as background noise and speech clipping.  This appendix presents a case study illustrating how the PESQ Tools GUI may be used to identify these types of issues in a test file exhibiting low MOS results.  To start, the Reference Waveform, Reference Surface and Reference Spectrogram tabs display what the test waveform should look like.  The Degraded Waveform, Degraded Surface and Degraded Spectrogram tabs display what the waveform actually looks like.  Sample waveforms with low MOS results will contain visible differences from the reference images.  In this example, the Reference Waveform is free of noise during periods without speech while the Degraded Waveform exhibits a constant buzzing during “quiet” periods:   Figure D-1 - Reference Waveform   Figure D-2 - Degraded Waveform  Not only is this buzzing visible in the Degraded Waveform image, it is audible in the .WAV file.  Use the Play   button to listen to the effect of the buzzing on the speech sample.
 Nomad User’s Manual  Appendix D – PESQ Tools GUI Case Study Copyright © Spirent Communications, Inc. 2013    150  The issue of speech clipping is visible when comparing the Reference Surface to the Degraded Surface:   Figure D-3 - Reference Surface   Figure D-4 - Degraded Surface  Speech clipping is also apparent on the Error Surface graph.  The Error Surface represents the Reference Surface minus the Degraded Surface.  Therefore, errors that represent missing signal (i.e. clipping) will have positive values on this chart.  Errors that add to the signal (i.e. noise) will have negative values here.   Figure D-5 - Error Surface
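The Error Surface sign convention is easy to demonstrate with two toy arrays.  The sketch below is illustrative only, using made-up values rather than real surface data: positive values mark energy present in the reference but missing from the degraded signal (for example, clipping), while negative values mark added energy (for example, noise).

```python
# Toy illustration of the Error Surface = Reference - Degraded convention.
reference = [0.0, 5.0, 8.0, 5.0, 0.0]   # made-up reference surface values
degraded  = [0.0, 5.0, 3.0, 5.0, 2.0]   # clipped at index 2, noise at index 4

error_surface = [r - d for r, d in zip(reference, degraded)]
clipping_indexes = [i for i, e in enumerate(error_surface) if e > 0]  # missing signal
noise_indexes    = [i for i, e in enumerate(error_surface) if e < 0]  # added signal

print(error_surface)       # [0.0, 0.0, 5.0, 0.0, -2.0]
print(clipping_indexes)    # [2]
print(noise_indexes)       # [4]
```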
 Nomad User’s Manual  Appendix D – PESQ Tools GUI Case Study Copyright © Spirent Communications, Inc. 2013    151  The Reference and Degraded Signal spectrums available in the PESQ Tools GUI provide another means of analysis.  These figures can be used to visually compare the reference to the degraded frequency response.  Note that although this example exhibits frequency fade above 3.5 kHz, this is a normal response for mobiles calling the Audio Server and is not expected to affect the MOS outcome.   Figure D-6 - Reference and Degraded Signal Spectrums  The buzzing, clipping and frequency response characteristics are all apparent upon comparison of the Reference Spectrogram to the Degraded Spectrogram.  The Spectrogram graphs provide frequency spectra information over the duration of the sample.   Figure D-7 - Reference Spectrogram
 Nomad User’s Manual  Appendix D – PESQ Tools GUI Case Study Copyright © Spirent Communications, Inc. 2013    152   Figure D-8 - Degraded Spectrogram  In conclusion, this example has illustrated how the PESQ Tools GUI may be used to identify specific areas of a speech sample contributing to poor MOS results.  In this case, multiple factors of background buzzing and speech clipping were identified.  Using this information, the source of each issue may now be identified and rectified in order to achieve the desired MOS results.
 Nomad User’s Manual  Appendix E – Introduction to POLQA Copyright © Spirent Communications, Inc. 2013    153  Appendix E – Introduction to POLQA The worldwide prevailing standard for mobile voice quality analysis has been ITU-T P.862, known as Perceptual Evaluation of Speech Quality (PESQ).  PESQ implements automated testing of telecommunications using actual speech samples, comparison of the reference signal (transmitting side) to the degraded channel (listening side), and generation of mean opinion scores (MOS) to model subjective listening patterns.  This technique has been widely adopted due to its capability to automate collections of large sample sets simulating real-world subscriber experience.  As network technologies mature and evolve, the drivers of performance change, and new methodologies for measuring and assuring quality are required.  PESQ is logically being succeeded by ITU-T P.863, Perceptual Objective Listening Quality Analysis (POLQA).  Advantages of POLQA include:   Significantly expanded set of codecs, including AMR-WB, EVRC, EVRC-WB, Skype / SILK, G.711 and G.729.  Designed to handle more complex end-to-end network architectures and quality management techniques such as smart loss concealment and time stretching.  Two operational modes to distinctly address narrowband and super-wideband communication.  Three-fold increase in evaluation set used compared to PESQ, resulting in considerably smaller residual prediction errors even as the application range has expanded substantially.  Seamless upgrade path from, and backward compatibility with, PESQ.  POLQA provides more robust quality predictions for:   Cross-technology quality benchmarking (such as GSM vs. CDMA)  Noise reduction and voice quality enhancement  Time scaling, unified communication and VoIP  Non-optimal presentation levels  Filtering and spectral shaping  Recordings made at an ear simulator
 Nomad User’s Manual  Appendix E – Introduction to POLQA Copyright © Spirent Communications, Inc. 2013    154  The table below compares PESQ and POLQA at a glance:   PESQ POLQA Codecs  AMR  EFR  AMR  AMR-WB  EFR  EVRC  EVRC-WB  iLBC  AMB+  AAC  Skype / SILK  G.711  G.729  Reference Speech Material  8 kHz  8 kHz  48 kHz  Applications  POTS  VoIP  3G  HD Voice  Voice Enhancement Devices  Skype Calls  Benchmarking CDMA and GSM   POLQA scoring is available as an optional upgrade to Nomad.  To configure a POLQA-enabled Nomad installation for POLQA scoring:   On the Settings→ Voice Quality tab, set the Default Scoring Algorithm to POLQA.  When creating a new Voice Quality task, confirm that the Scoring Model in the Session dialog is set to POLQA.  During a Voice Quality test using POLQA, the Voice Quality Task Status Window displays Attenuation as the amplitude of the current waveform.  Attenuation measures the downlink signal gain reduced by Nomad.  For the best results, Spirent Communications recommends maintaining downlink attenuation between 0 and 7 dB, ideally as close to 0 dB as possible.  Increase Input Level to decrease Attenuation toward 0 dB.  Note that compared to PESQ, POLQA is more forgiving when it comes to attenuation adjustment.  During a Voice Quality test using POLQA, the Voice Quality Task Status Window displays the Algorithm as POLQA.
 Nomad User’s Manual  Appendix E – Introduction to POLQA Copyright © Spirent Communications, Inc. 2013    155   Figure E-0-1 - Voice Quality Task Status Window for a POLQA Test  When generating the output report, Nomad always scores the data using the Scoring Model specified in the test definition.  For example, a test defined with the POLQA scoring model will be scored using POLQA.  Three options exist to score the same data using the other scoring model:   Re-score Files – On the Data → Complete tab, right-click on the file of interest and select Re-score Files.  The scoring method not used on the previous pass will be used.  Batch Scoring – This option scores multiple .WAV files using PESQ, POLQA or both.  The output of this utility is a delimited text file that may be viewed in raw form, opened in Excel or parsed with a script.  This utility can be used to score previously unprocessed files or for re-scoring.  Access the Batch Scoring utility from the Data → Utilities tab.    Offline Scoring – Retrieve and score uplink .WAV files from the Audio Server using either PESQ or POLQA with this utility.  Access the Offline Scoring utility from the Data → Utilities tab.
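Because the Batch Scoring utility writes a delimited text file that “may be viewed in raw form, opened in Excel or parsed with a script,” one hypothetical way to parse it with Python’s standard csv module is sketched below.  The delimiter and the column names used here (File, Algorithm, MOS) are assumptions for illustration only, not the utility’s documented format; inspect an actual output file before relying on them.

```python
# Hypothetical parser for a Batch Scoring output file. The tab delimiter
# and the "File", "Algorithm" and "MOS" column names are assumptions made
# for this example; check a real output file for the actual layout.

import csv

def load_batch_scores(path: str, delimiter: str = "\t") -> list[dict]:
    with open(path, newline="") as f:
        return list(csv.DictReader(f, delimiter=delimiter))

# Example usage (assuming the hypothetical columns above exist):
# rows = load_batch_scores("batch_scores.txt")
# polqa_rows = [r for r in rows if r["Algorithm"] == "POLQA"]
# print(sum(float(r["MOS"]) for r in polqa_rows) / len(polqa_rows))
```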
 Nomad User’s Manual  Appendix F – Nomad HD Hardware Copyright © Spirent Communications, Inc. 2013    156  Appendix F – Nomad HD Hardware   Figure F-0-1 – Spirent Communications HD ME Hardware Unit  The Nomad HD hardware allows for connection to a mobile device via one of three interfaces: Narrowband, High Definition, or Bluetooth.

Important Safety Note
Any usage of the equipment in a manner not specified by the manufacturer may impair features related to safety and user protection.
Tout usage de cet équipement, qui n’est pas conforme aux spécifications du manufacturier, peut affecter les fonctions relatives à la sécurité et la protection de l’utilisateur.

Nomad HD LEDs
 L1, L2, L3, L4 – Indicates the status of the link over the active interface for the given channel.  For example, when utilizing the Bluetooth interface, the LED indicates the status of the Bluetooth link.  The general behavior is as follows:
o Dark – Channel is disabled
o Green – Channel is enabled for Narrowband Analog
o White – Channel is enabled for Wideband Analog
o Cyan (Flashing) – Bluetooth initialization
o Cyan (Solid) – Bluetooth initialized
o Blue (Flashing) – Bluetooth Pairing “On”
o Blue (Solid) – Bluetooth is Paired
o Yellow (Flashing) – Outgoing call setup
o Yellow (Solid) – Outgoing call in progress
o Magenta (Flashing) – Incoming call setup
o Magenta (Solid) – Incoming call in progress
 Nomad User’s Manual  Appendix F – Nomad HD Hardware Copyright © Spirent Communications, Inc. 2013    157
o Red – Overdrive
 Pwr – When lit, indicates that power is applied to the unit.

Base Unit Physical Interfaces
 HD1, HD2, HD3, HD4 – Interface for High Definition Audio
 NB1, NB2, NB3, NB4 – Interface for Narrowband Audio
 PC – USB interface for communication with the PC
 GPS – USB interface for communication with a GPS device.  The interface functions as a normal USB port for the PC.
 +12V – Power supply interface.

Physical Specifications
 Dimensions (H x W x D) – 1 5/8 in x 10 ¼ in x 4 7/8 in
 Weight – 2.2 lbs
 Case material – Aluminum
 Communication interfaces – Bluetooth, Analog, USB

Power Specifications
 AC operation – Requires external AC adapter.  Adapter specifications: Input – 100 to 240 V, 50-60 Hz, 0.4 amps; Output – 12 V, 1.0 amps
 Maximum power usage – 10 Watts
 Maximum heat dissipation – 10 Watts

Environmental Requirements
 Operating temperature – 0 to 55°C
 Storage temperature – -20 to 70°C
 Humidity tolerance – 5 to 85% RH at 40°C
