WO2013088453A2 - Analytic tool for customer experience evaluation and network optimization - Google Patents
Analytic tool for customer experience evaluation and network optimization Download PDFInfo
- Publication number
- WO2013088453A2 (PCT/IN2012/000789)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- testing device
- data
- network
- mobile communication
- communication network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W24/00—Supervisory, monitoring or testing arrangements
- H04W24/08—Testing, supervising or monitoring using real traffic
Definitions
- This invention relates to mobile communication networks, and more particularly to improving the quality of mobile communication networks.
- monitoring and troubleshooting are also done based on data collected in the field and data from network probes. This supplements information not available in the network call records.
- Performance monitoring is not done based on customer profile.
- the network monitoring is not done separately based on the service offerings an operator may have for different segments, such as pre-paid or post-paid, mass-market or youth-segment, roaming subscribers and so on.
- the principal object of this invention is a method and system for evaluating customer experience and opportunities for network optimization through a two step process.
- the embodiments herein achieve a method and system for evaluating customer experience and opportunities for network optimization through a two step process.
- FIGS. 1 through 5 where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
- FIG. 2 depicts a test device, according to embodiments as disclosed herein;
- FIG. 3 is a flow chart depicting the process, according to embodiments disclosed herein;
- FIGs. 4a and 4b depict exemplary scenarios, according to embodiments as disclosed herein.
- FIG. 5 depicts an example of a test case, data resulting from the test case and a brief conclusion based on the data, according to embodiments as disclosed herein.
- FIG. 1 depicts a mobile communication network, according to embodiments as disclosed herein.
- the network comprises a testing device 101, at least one User Equipment (UE) 102, a Base Station (BS) 103 and a Mobile Station Controller (MSC) 104.
- the testing device 101 is connected to the UE 102 using a suitable means.
- the suitable means may be a wired means (such as a cable and so on) or a wireless means (Bluetooth and so on).
- the UE 102 communicates with the BS 103 through an air interface.
- the BS 103 is connected to the MSC 104 using a suitable means and the MSC 104 is further connected to the mobile communication network.
- the UE 102 and the testing device 101 may be combined as a single device, wherein the combined device performs the functionalities of the testing device 101 and the UE 102.
- the testing device 101 is configured to identify customer experiences/scenarios for testing and create corresponding test cases to test the scenarios.
- the testing device 101 then analyzes the test data created as a result of running the test cases over a period of time.
- FIG. 2 depicts a testing device, according to embodiments as disclosed herein.
- the testing device 101 comprises a selection module 201, a test management module 202, an analysis module 203 and a database 204.
- the selection module 201 identifies customer scenarios for testing.
- the identified scenarios may be scenarios which are not tested or analyzed by the mobile communication network.
- the scenarios may range from voice testing scenarios (voice call with/without CRBT feature, call progress tone, announcement for rejected call, related to announcements, related to ISD dialing, related to abnormal call release, etc.), call waiting scenarios, scenarios related to call forwarding, scenarios related to SMS, scenarios related to data calls and so on.
- Exemplary lists of scenarios which may be identified by the selection module 201 have been provided in FIGs. 4a and 4b.
- the test management module 202, on receiving the identified scenarios from the selection module 201, creates test cases to cover the identified scenarios.
- the test cases cover possible variations in the mobile network arising due to a combination of one or more different customer profiles, operators, geographies, equipment vendors and so on.
- the test management module 202 also performs the test on the mobile network, according to the developed test cases.
- the test management module 202 may run these tests with the assistance of an external UE 102 or UE-like functionalities which have been built into the testing device 101.
- the test management module 202 samples the test data produced as a result of running the test cases.
- the data collected by the test management module 202 may result from sampling the content in the bearer channel.
- This sampling may be performed with the assistance of an external UE 102 or UE-like functionalities which have been built into the testing device 101.
- the data collected by the test management module 202 may cover different customer profiles, operators, geographies, equipment vendors and so on.
- the data collected by the test management module 202 may be stored in the database 204.
- the analysis module 203 receives data from the test management module 202.
- the analysis module 203 may also fetch the data from the database 204.
- the analysis module 203 normalizes the data and performs various analytical operations on the data.
- the analysis module 203 may also fetch related data from the network for the purposes of performing the analysis, where the data comprises of network parameters, traffic models, network architecture and so on.
- the analysis module 203 may also use further information such as topographical information, urban layout of the area, density of the area and so on for performing the analysis, which may be fetched from the network or from an available alternate source.
- the analysis module 203 may also perform trend analysis of specific parameters over a period of time, based on data which may be present in the database 204 or made available by the mobile network.
- the analysis module 203 also provides reports, based on the analysis.
- the reports may include recommendations to de-congest the mobile communication network, optimizing network parameters, architecture and traffic model for improved efficiencies and so on.
- the analysis module 203 may also generate trend reports of specific parameters over a period of time, based on data which may be present in the database 204 or made available by the mobile network.
- the analysis module 203 may also provide comparative benchmarks for improving experiences like call setup time or data speeds.
- the analysis module 203 may also assist in identifying opportunities for changes in network architecture, feature or parameter settings.
- the analysis module 203 may also generate comparative results, which may lead to identification of reasons which can lead to changes that help improve experience and resource utilization in the network. The reports may be made available to the operator of the testing device 101, the mobile network operator or any other interested person in a suitable format.
- FIG. 3 is a flow chart depicting the process, according to embodiments disclosed herein.
- the testing device 101 identifies (301 ) relevant scenarios for testing.
- the identified scenarios may be scenarios which are not tested or analyzed by the mobile communication network. Exemplary lists of scenarios which may be identified by the selection module 201 have been provided in FIGs. 4a and 4b.
- the testing device 101 then creates (302) test cases to cover the identified scenarios.
- the test cases cover possible variations in the mobile network arising due to a combination of one or more different customer profiles, operators, geographies, equipment vendors and so on.
- the testing device 101 also performs (303) at least one of the tests on the mobile network, according to the developed test cases.
- the testing device 101 may run these tests with the assistance of an external UE 102 or UE-like functionalities which have been built into the testing device 101.
- the testing device 101 samples (304) the test data produced as a result of the testing device 101 running the test cases.
- the data collected by the testing device 101 may result from sampling the content in the bearer channel. This sampling may be performed with the assistance of an external UE 102 or UE-like functionalities which have been built into the testing device 101.
- the testing device 101 normalizes (305) the data and performs (306) various analytical operations on the data, which may use data from the network and other sources. Based on the analysis, the testing device 101 generates (307) various types of reports.
- the reports may include recommendations to de-congest the mobile communication network, improving efficiencies in the network, comparative benchmarks for improving experiences, identifying opportunities for changes in network architecture, feature or parameter settings, generating comparative results and so on.
- the various actions in the method may be performed in the order presented, in a different order or simultaneously; in some embodiments, some actions listed in FIG. 3 may be omitted.
- FIG. 5 depicts an example of a test case, data resulting from the test case and a brief conclusion based on the data, according to embodiments as disclosed herein.
- the figure lists the types of tests that were run and a comparison of the results of the tests across various operators. Based on the tests, the analytical tool created recommended benchmarks for the target range of results. The results that do not fall within the target range are highlighted.
- the time range in which an announcement for an unsuccessful call should be made is 15 to 20 seconds. From FIG. 5, it can be seen that operators 1, 2, 4 and 6 are not in the required time range.
- the time range in which an announcement for an unanswered call should be made is 30 to 40 seconds. From FIG. 5, it can be seen that operators 1, 3, 5 and 6 are not in the required time range.
- Embodiments disclosed herein enable mobile network operators to gain insight into customer experience and opportunities for optimization of network resources through sampling of data for different experiences and feeding the data into an analytical tool to identify opportunities for optimization.
- Embodiments herein disclose sampling of customer experience, which provides insight into experiences not monitored in the network. Further, comparison of customer experiences provides opportunities for improvement that could be due to network architecture or parameter settings. These are design issues that cannot be captured through performance monitoring tools.
- Profiling of customer experiences (Call progress, Ring Tone, announcement, One- way speech, echo and cross connect and so on) provides opportunities for optimization for better customer experience or resource management.
- a comparison of experiences not monitored (such as accuracy of calling line identification) provides an opportunity for improved network performance or resource management.
- Embodiments disclosed herein provide comparative benchmarks for improving experiences like call setup time or data speeds. Also, embodiments herein help in identifying opportunities for changes in network architecture, feature or parameter settings. Further, embodiments herein enable identification of reasons which can lead to changes that help improve experience and resource utilization.
- the embodiments disclosed herein can be implemented through at least one software program, running on at least one hardware device and performing network management functions to control the network elements.
- the network elements shown in FIGS. 1 and 2 include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
Analytic Tool for Customer Experience Evaluation and Network Optimization. This invention relates to mobile communication networks, and more particularly to improving the quality of mobile communication networks. The principal object of this invention is a method and system for evaluating customer experience and opportunities for network optimization through a two-step process. Embodiments herein disclose sampling of customer experience, which provides insight into experiences not monitored in the network. Sampling across different test scenarios also provides benchmark references for comparison and improvement in performance/experience. Embodiments disclosed herein enable mobile network operators to gain insight into customer experience and opportunities for optimization of network resources through sampling of data for different experiences and feeding the data into an analytical tool to identify opportunities for optimization.
Description
ANALYTIC TOOL FOR CUSTOMER EXPERIENCE EVALUATION AND NETWORK OPTIMIZATION
FIELD OF INVENTION
[001] This invention relates to mobile communication networks, and more particularly to improving the quality of mobile communication networks.
BACKGROUND OF INVENTION
[002] Currently, communication networks are restricted to monitoring only the call records generated in the system (by system providers or OEMs). The monitoring performed is only of the signaling channel, while the bearer channel is monitored only for quality of the channel; there is no monitoring of the content present in the bearer channel. There are additional network experiences that are still missed out and not monitored. The communication networks perform further troubleshooting and network optimization based on the results of the monitoring of the call records.
[003] Further, monitoring and troubleshooting are also done based on data collected in the field and data from network probes. This supplements information not available in the network call records.
[004] Many opportunities to optimize/improve the network (even when there are no faults) are missed because there is a lack of reference benchmarks. Very few parameter indicators have benchmarks.
[005] Performance monitoring is not done based on customer profile. For example, network monitoring is not done separately based on the service offerings an operator may have for different segments, such as pre-paid or post-paid, mass-market or youth-segment, roaming subscribers and so on.
OBJECT OF INVENTION
[006] The principal object of this invention is a method and system for evaluating customer experience and opportunities for network optimization through a two step process.
[007] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
STATEMENT OF INVENTION
[008] The embodiments herein achieve a method and system for evaluating customer experience and opportunities for network optimization through a two step process.
[009] Referring now to the drawings, and more particularly to FIGS. 1 through 5, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
BRIEF DESCRIPTION OF FIGURES
[0010] This invention is illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[0011] FIG. 1 depicts a mobile communication network, according to embodiments as disclosed herein;
[0012] FIG. 2 depicts a test device, according to embodiments as disclosed herein;
[0013] FIG. 3 is a flow chart depicting the process, according to embodiments disclosed herein;
[0014] FIGs. 4a and 4b depict exemplary scenarios, according to embodiments as disclosed herein; and
[0015] FIG. 5 depicts an example of a test case, data resulting from the test case and a brief conclusion based on the data, according to embodiments as disclosed herein.
[0016] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0017] FIG. 1 depicts a mobile communication network, according to embodiments as disclosed herein. The network, as depicted, comprises a testing device 101, at least one User Equipment (UE) 102, a Base Station (BS) 103 and a Mobile Station Controller (MSC) 104. The testing device 101 is connected to the UE 102 using a suitable means. The suitable means may be a wired means (such as a cable and so on) or a wireless means (Bluetooth and so on). The UE 102 communicates with the BS 103 through an air interface. The BS 103 is connected to the MSC 104 using a suitable means, and the MSC 104 is further connected to the mobile communication network.
[0018] In an embodiment herein, the UE 102 and the testing device 101 may be combined as a single device, wherein the combined device performs the functionalities of the testing device 101 and the UE 102.
[0019] The testing device 101 is configured to identify customer experiences/scenarios for testing and create corresponding test cases to test the scenarios. The testing device 101 then analyzes the test data created as a result of running the test cases over a period of time.
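As an illustrative sketch only (not part of the claimed method; all names are hypothetical), the test-case creation of paragraph [0019] — expanding each identified scenario across the variations it must cover — could be expressed as:

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class TestCase:
    scenario: str   # e.g. "voice_call_with_crbt"
    profile: str    # customer profile, e.g. "pre-paid"
    operator: str
    geography: str

def create_test_cases(scenarios, profiles, operators, geographies):
    """Expand identified scenarios across customer profiles,
    operators and geographies, one test case per combination."""
    return [TestCase(s, p, o, g)
            for s, p, o, g in product(scenarios, profiles, operators, geographies)]

cases = create_test_cases(
    ["voice_call_with_crbt", "call_forwarding"],
    ["pre-paid", "post-paid"],
    ["operator_1"],
    ["urban"],
)
# 2 scenarios x 2 profiles x 1 operator x 1 geography = 4 test cases
```

Equipment vendors and further dimensions from the description would simply add factors to the product.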
[0020] FIG. 2 depicts a testing device, according to embodiments as disclosed herein. The testing device 101, as depicted, comprises a selection module 201, a test management module 202, an analysis module 203 and a database 204.
[0021] The selection module 201 identifies customer scenarios for testing. The identified scenarios may be scenarios which are not tested or analyzed by the mobile communication network. The scenarios may range from voice testing scenarios (voice call with/without CRBT feature, call progress tone, announcement for rejected call, related to announcements, related to ISD dialing, related to abnormal call release, etc.), call waiting scenarios, scenarios related to call forwarding, scenarios related to SMS, scenarios related to data calls and so on. Exemplary lists of scenarios which may be identified by the selection module 201 have been provided in FIGs. 4a and 4b.
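A minimal sketch of the selection step of paragraph [0021] — keeping only scenarios the network itself does not already monitor — might look as follows. The scenario names and the assumption that "data_call" is already network-monitored are hypothetical:

```python
ALL_SCENARIOS = {
    "voice_call_with_crbt", "call_progress_tone", "rejected_call_announcement",
    "isd_dialing", "abnormal_call_release", "call_waiting",
    "call_forwarding", "sms_delivery", "data_call",
}
# Assumed for illustration: data calls are already covered by network counters.
NETWORK_MONITORED = {"data_call"}

def select_scenarios(all_scenarios, already_monitored):
    """Keep only scenarios the mobile network does not test or analyze."""
    return sorted(all_scenarios - already_monitored)

selected = select_scenarios(ALL_SCENARIOS, NETWORK_MONITORED)
```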
[0022] The test management module 202, on receiving the identified scenarios from the selection module 201, creates test cases to cover the identified scenarios. The test cases cover possible variations in the mobile network arising due to a combination of one or more different customer profiles, operators, geographies, equipment vendors and so on. The test management module 202 also performs the tests on the mobile network, according to the developed test cases. The test management module 202 may run these tests with the assistance of an external UE 102 or UE-like functionalities which have been built into the testing device 101. The test management module 202 samples the test data produced as a result of running the test cases. The data collected by the test management module 202 may result from sampling the content in the bearer channel. This sampling may be performed with the assistance of an external UE 102 or UE-like functionalities which have been built into the testing device 101. The data collected by the test management module 202 may cover different customer profiles, operators, geographies, equipment vendors and so on. The data collected by the test management module 202 may be stored in the database 204.
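The run-and-sample behaviour of the test management module 202 can be sketched as below. The `FakeUE` class and its `dial`/`sample_bearer` methods are entirely hypothetical stand-ins for the external UE 102 (a real adapter would drive the radio and capture actual bearer-channel content):

```python
import time

class FakeUE:
    """Hypothetical stand-in for UE 102; returns placeholder bearer content."""
    def dial(self, number):
        pass  # a real UE adapter would place the call here
    def sample_bearer(self, seconds):
        # Placeholder: `seconds` worth of 8 kHz, 8-bit silence.
        return b"\x00" * (8000 * seconds)

def run_and_sample(case, ue):
    """Run one test case through the (external or built-in) UE and
    record bearer-channel content alongside a basic timing measure."""
    started = time.time()
    ue.dial(case["number"])
    audio = ue.sample_bearer(seconds=2)
    return {
        "scenario": case["scenario"],
        "setup_time_s": round(time.time() - started, 3),
        "bearer_sample": audio,
    }

record = run_and_sample({"scenario": "voice_call", "number": "+91000000"}, FakeUE())
```

Each such record would then be tagged with profile/operator/geography and stored in the database 204.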
[0023] The analysis module 203 receives data from the test management module 202. The analysis module 203 may also fetch the data from the database 204. The analysis module 203 normalizes the data and performs various analytical operations on the data. The analysis module 203 may also fetch related data from the network for the purposes of performing the analysis, where the data comprises network parameters, traffic models, network architecture and so on. The analysis module 203 may also use further information such as topographical information, urban layout of the area, density of the area and so on for performing the analysis, which may be fetched from the network or from an available alternate source. The analysis module 203 may also perform trend analysis of specific parameters over a period of time, based on data which may be present in the database 204 or made available by the mobile network.
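Two of the operations named in paragraph [0023] — normalization and trend analysis — could be sketched as below. These are simple illustrative choices (min-max scaling and a half-split trend estimate), not the patent's prescribed algorithms:

```python
from statistics import mean

def normalize(samples, key):
    """Min-max scale a measured parameter to [0, 1] so results from
    different operators/geographies can be compared on one axis."""
    values = [s[key] for s in samples]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [(s[key] - lo) / span for s in samples]

def trend(samples, key):
    """Crude trend over a time-ordered series: difference between the
    mean of the second half and the mean of the first half."""
    values = [s[key] for s in samples]
    half = len(values) // 2
    return mean(values[half:]) - mean(values[:half])
```

A positive `trend` result for, say, call setup time over successive test runs would flag a degrading parameter for the reports of paragraph [0024].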
[0024] The analysis module 203 also provides reports, based on the analysis. The reports may include recommendations to de-congest the mobile communication network, optimizing network parameters, architecture and traffic model for improved efficiencies and so on. The analysis module 203 may also generate trend reports of specific parameters over a period of time, based on data which may be present in the database 204 or made available by the mobile network. The analysis module 203 may also provide comparative benchmarks for improving experiences like call setup time or data speeds. The analysis module 203 may also assist in identifying opportunities for changes in network architecture, feature or parameter settings. The analysis module 203 may also generate comparative results, which may lead to identification of reasons which can lead to changes that help improve experience and resource utilization in the network.
[0025] The reports may be made available to the operator of the testing device 101, the mobile network operator or any other interested person in a suitable format.
[0026] FIG. 3 is a flow chart depicting the process, according to embodiments disclosed herein. The testing device 101 identifies (301) relevant scenarios for testing. The identified scenarios may be scenarios which are not tested or analyzed by the mobile communication network. Exemplary lists of scenarios which may be identified by the selection module 201 have been provided in FIGs. 4a and 4b. The testing device 101 then creates (302) test cases to cover the identified scenarios. The test cases cover possible variations in the mobile network arising due to a combination of one or more different customer profiles, operators, geographies, equipment vendors and so on. The testing device 101 also performs (303) at least one of the tests on the mobile network, according to the developed test cases. The testing device 101 may run these tests with the assistance of an external UE 102 or UE-like functionalities which have been built into the testing device 101. The testing device 101 samples (304) the test data produced as a result of the testing device 101 running the test cases. The data collected by the testing device 101 may result from sampling the content in the bearer channel. This sampling may be performed with the assistance of an external UE 102 or UE-like functionalities which have been built into the testing device 101. The testing device 101 normalizes (305) the data and performs (306) various analytical operations on the data, which may use data from the network and other sources. Based on the analysis, the testing device 101 generates (307) various types of reports. The reports may include recommendations to de-congest the mobile communication network, improving efficiencies in the network, comparative benchmarks for improving experiences, identifying opportunities for changes in network architecture, feature or parameter settings, generating comparative results and so on. The various actions in the method may be performed in the order presented, in a different order or simultaneously; in some embodiments, some actions listed in FIG. 3 may be omitted.
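The end-to-end flow of FIG. 3 (steps 301–307) can be summarized in code. The `DemoDevice` stub and its method names are hypothetical placeholders for the testing device 101's modules:

```python
class DemoDevice:
    """Toy stand-in for testing device 101; each method mirrors a FIG. 3 step."""
    def identify_scenarios(self):               # 301
        return ["voice_call", "sms"]
    def create_test_cases(self, scenarios):     # 302
        return [{"scenario": s} for s in scenarios]
    def run(self, case):                        # 303-304: run test, sample data
        return {**case, "setup_time_s": 4.2}
    def normalize(self, raw):                   # 305
        return raw
    def analyze(self, data):                    # 306
        return {"n_samples": len(data)}
    def report(self, analysis):                 # 307
        return f"tested {analysis['n_samples']} cases"

def evaluate(device):
    """Identify -> create -> run/sample -> normalize -> analyze -> report."""
    scenarios = device.identify_scenarios()
    cases = device.create_test_cases(scenarios)
    raw = [device.run(case) for case in cases]
    data = device.normalize(raw)
    return device.report(device.analyze(data))

summary = evaluate(DemoDevice())
```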
[0027] FIG. 5 depicts an example of a test case, data resulting from the test case and a brief conclusion based on the data, according to embodiments as disclosed herein. The figure lists the types of tests that were run and a comparison of the results of the tests across various operators. Based on the tests, the analytical tool created recommended benchmarks for the target range of results. The results that do not fall within the target range are highlighted.
[0028] For example, the time range in which an announcement for an unsuccessful call should be made is 15 to 20 seconds. From FIG. 5, it can be seen that operators 1, 2, 4 and 6 are not in the required time range.
[0029] For example, the time range in which an announcement for an unanswered call should be made is 30 to 40 seconds. From FIG. 5, it can be seen that operators 1, 3, 5 and 6 are not in the required time range.
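The benchmark-range check of FIG. 5 — flagging operator results outside the recommended target ranges in paragraphs [0028] and [0029] — could be sketched as follows; the dictionary layout and metric names are illustrative:

```python
# Target ranges from paragraphs [0028]-[0029], in seconds.
TARGETS = {
    "unsuccessful_call_announcement_s": (15, 20),
    "unanswered_call_announcement_s": (30, 40),
}

def out_of_range(results, targets=TARGETS):
    """Return (operator, metric, value) for every result that falls
    outside its recommended benchmark range, as highlighted in FIG. 5."""
    flagged = []
    for operator, measures in results.items():
        for name, value in measures.items():
            lo, hi = targets[name]
            if not (lo <= value <= hi):
                flagged.append((operator, name, value))
    return flagged

# Hypothetical measurements for two operators.
demo = {
    "operator_1": {"unsuccessful_call_announcement_s": 25,
                   "unanswered_call_announcement_s": 35},
    "operator_3": {"unsuccessful_call_announcement_s": 18,
                   "unanswered_call_announcement_s": 45},
}
flagged = out_of_range(demo)
```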
[0030] Embodiments disclosed herein enable mobile network operators to gain insight into customer experience and opportunities for optimization of network resources through sampling of data for different experiences and feeding the data into an analytical tool to identify opportunities for optimization.
[0031] Embodiments herein disclose sampling of customer experience, which provides insight into experiences not monitored in the network. Further, comparison of customer experiences provides opportunities for improvement that could be due to network architecture or parameter settings. These are design issues that cannot be captured through performance monitoring tools. Profiling of customer experiences (call progress, ring tone, announcement, one-way speech, echo and cross connect and so on) provides opportunities for optimization for better customer experience or resource management. A comparison of experiences not monitored (such as accuracy of calling line identification) provides an opportunity for improved network performance or resource management.
[0032] Embodiments disclosed herein provide comparative benchmarks for improving experiences like call setup time or data speeds. Also, embodiments herein help in identifying opportunities for changes in network architecture, feature or parameter settings. Further, embodiments herein enable identification of reasons which can lead to changes that help improve experience and resource utilization.
[0033] The data received by sampling of customer experiences and benchmarking them against similar experiences within or outside network can provide opportunities to mobile network operators to improve customer experience and optimize resource usage and thus operating and capital expenses.
[0034] The embodiments disclosed herein can be implemented through at least one software program, running on at least one hardware device and performing network management functions to control the network elements. The network elements shown in FIGS. 1 and 2 include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
[0035] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
Claims
1. A method for a testing device to evaluate performance of a mobile communication network, said method comprising:
identifying scenarios related to said mobile communication network for testing by said testing device;
creating test cases corresponding to said identified scenarios by said testing device;
running said test cases on said mobile communication network by said testing device; and
collecting data resulting from running said test cases by said testing device.
2. The method, as claimed in claim 1, wherein said identified scenarios are not tested by said mobile communication network.
3. The method, as claimed in claim 1, wherein said testing device incorporates functions of a User Equipment (UE).
4. The method, as claimed in claim 1, wherein said testing device is connected to an external UE.
5. The method, as claimed in claim 1, wherein said testing device analyzes said collected data.
6. The method, as claimed in claim 5, wherein said testing device uses a plurality of information while analyzing said data, wherein said plurality of information comprises:
traffic models being used in said mobile communication network;
architecture of said mobile communication network;
topographical information of the area being tested;
urban layout of the area being tested; and
density of the area being tested.
7. The method, as claimed in claim 1, wherein said testing device normalizes said data.
8. The method, as claimed in claim 1, wherein said testing device generates a report based on at least one of:
said analysis;
said data; and
previously collected data.
9. The method, as claimed in claim 1, wherein said testing device stores said data.
10. The method, as claimed in claim 1, wherein said data is used to create benchmarks.
11. A system performing a method as in at least one of preceding method claims 1 to 10.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IN3400/MUM/2011 | 2011-12-02 | ||
| IN3400MU2011 | 2011-12-02 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2013088453A2 true WO2013088453A2 (en) | 2013-06-20 |
| WO2013088453A3 WO2013088453A3 (en) | 2013-10-03 |
Family
ID=48613312
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IN2012/000789 Ceased WO2013088453A2 (en) | 2011-12-02 | 2012-12-03 | Analytic tool for customer experience evaluation and network optimization |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2013088453A2 (en) |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7224968B2 (en) * | 2001-11-23 | 2007-05-29 | Actix Limited | Network testing and monitoring systems |
| WO2008053316A2 (en) * | 2006-10-30 | 2008-05-08 | Nokia Corporation | Method, apparatus and system for testing user equipment functionality |
| GB0911655D0 (en) * | 2009-07-06 | 2009-08-12 | Omnifone Ltd | Automatic mobile internet gateway configuration interrogation (snake) |
- 2012-12-03: WO PCT/IN2012/000789 patent/WO2013088453A2/en, not active (Ceased)
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105263145A (en) * | 2015-10-30 | 2016-01-20 | 中国铁塔股份有限公司齐齐哈尔市分公司 | Method for optimization processing of signal test data in wireless network plan |
| CN105263145B (en) * | 2015-10-30 | 2018-05-08 | 中国铁塔股份有限公司齐齐哈尔市分公司 | The optimized treatment method of signal testing data in a kind of wireless network planning |
| US10467128B2 (en) | 2016-09-08 | 2019-11-05 | International Business Machines Corporation | Measuring and optimizing test resources and test coverage effectiveness through run time customer profiling and analytics |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2013088453A3 (en) | 2013-10-03 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP1875728B1 (en) | Methods and apparatus for monitoring voice quality on a wireless communication device | |
| CN102075978B (en) | Voice service user negative perception-based network problem analysis method | |
| CN102256290B (en) | Method for collecting abnormal data of TD-SCDMA (Time Division-Synchronization Code Division Multiple Access) wireless communication network user terminal | |
| US9491285B2 (en) | Technique for performance management in a mobile communications network | |
| CN105357699B (en) | Quality of wireless network monitors system and method | |
| WO2014040633A1 (en) | Identifying fault category patterns in a communication network | |
| CN101925101A (en) | A method and device for user call process information collection and statistical analysis | |
| US20200322820A1 (en) | Determining Wireless Network Performance | |
| WO2017041406A1 (en) | Failure positioning method and device | |
| CN104113869B (en) | A kind of potential report user's Forecasting Methodology and system based on signaling data | |
| CN103841276B (en) | A kind of method that speech quality evaluation is carried out based on intelligent mobile phone platform | |
| CN101888654B (en) | Call quality test method, device and system | |
| CN101547466A (en) | An automatic call-testing system and method used to test the quality of the mobile communication network | |
| Soldani | Means and methods for collecting and analyzing QoE measurements in wireless networks | |
| CN104640138B (en) | A kind of method and device of orientation problem terminal | |
| US7606704B2 (en) | Quality assessment tool | |
| WO2013088453A2 (en) | Analytic tool for customer experience evaluation and network optimization | |
| CN104301916B (en) | Test optimization method, apparatus and system based on mobile intelligent terminal universal card | |
| CN102143017A (en) | Service real-time monitoring method and system | |
| CN108271189A (en) | A kind of quality of service monitoring method and device | |
| CN105227789B (en) | The hold-up interception method and device of a kind of harassing call | |
| US8055201B1 (en) | System and method for providing integrated voice quality measurements for wireless networks | |
| CN102946614A (en) | Wireless network monitoring system and monitoring method | |
| CN104994203A (en) | Conversation quality test system and method | |
| CN101448269A (en) | Method and system for determining call failure factor |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12857522 Country of ref document: EP Kind code of ref document: A2 |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 12857522 Country of ref document: EP Kind code of ref document: A2 |