
Test Performance Using Classes

This example shows how to create a performance test and a regression test for the fprintf function.

Write Performance Test

Consider the following unit (regression) test. You can run this test as a performance test using runperf('fprintfTest') instead of runtests('fprintfTest').

classdef fprintfTest < matlab.unittest.TestCase
    properties
        file
        fid
    end
    methods (TestMethodSetup)
        function openFile(testCase)
            testCase.file = tempname;
            testCase.fid = fopen(testCase.file,'w');
            testCase.assertNotEqual(testCase.fid,-1,'IO Problem')

            testCase.addTeardown(@delete,testCase.file);
            testCase.addTeardown(@fclose,testCase.fid);
        end
    end
    methods (Test)
        function testPrintingToFile(testCase)
            textToWrite = repmat('abcdef',1,5000000);
            fprintf(testCase.fid,'%s',textToWrite);
            testCase.verifyEqual(fileread(testCase.file),textToWrite)
        end
        function testBytesToFile(testCase)
            textToWrite = repmat('tests_',1,5000000);
            nbytes = fprintf(testCase.fid,'%s',textToWrite);
            testCase.verifyEqual(nbytes,length(textToWrite))
        end
    end
end

The measured time does not include the time to open and close the file or to perform the assertion, because these activities take place inside a TestMethodSetup block and not inside a Test block. However, the measured time does include the time to perform the verifications. A best practice is to measure a more accurate performance boundary.

Create a performance test in a file, fprintfTest.m, in your current working folder. This test is similar to the regression test, with the following modifications:

  • The test inherits from matlab.perftest.TestCase instead of matlab.unittest.TestCase.

  • The test calls the startMeasuring and stopMeasuring methods to create a measurement boundary around the fprintf function call.

classdef fprintfTest < matlab.perftest.TestCase
    properties
        file
        fid
    end
    methods (TestMethodSetup)
        function openFile(testCase)
            testCase.file = tempname;
            testCase.fid = fopen(testCase.file,'w');
            testCase.assertNotEqual(testCase.fid,-1,'IO Problem')

            testCase.addTeardown(@delete,testCase.file);
            testCase.addTeardown(@fclose,testCase.fid);
        end
    end
    methods (Test)
        function testPrintingToFile(testCase)
            textToWrite = repmat('abcdef',1,5000000);
            testCase.startMeasuring();
            fprintf(testCase.fid,'%s',textToWrite);
            testCase.stopMeasuring();
            testCase.verifyEqual(fileread(testCase.file),textToWrite)
        end
        function testBytesToFile(testCase)
            textToWrite = repmat('tests_',1,5000000);
            testCase.startMeasuring();
            nbytes = fprintf(testCase.fid,'%s',textToWrite);
            testCase.stopMeasuring();
            testCase.verifyEqual(nbytes,length(textToWrite))
        end
    end
end

The measured time for this performance test includes only the call to fprintf, and the testing framework still evaluates the qualifications.

Run Performance Test

Run the performance test. Depending on your system, you might see warnings that the performance testing framework ran the test the maximum number of times but did not achieve a 0.05 relative margin of error with a 0.95 confidence level.

results = runperf('fprintfTest')
Running fprintfTest
.......... .......... .
Done fprintfTest
__________

results = 

  1×2 TimeResult array with properties:

    Name
    Valid
    Samples
    TestActivity

Totals:
   2 Valid, 0 Invalid.
   4.1417 seconds testing time.

The results variable is a 1-by-2 TimeResult array. Each element of the array corresponds to one of the tests defined in the test file.

Display Test Results

Display the measurement results for the first test. Your results might vary.

results(1)
ans = 

  TimeResult with properties:

            Name: 'fprintfTest/testPrintingToFile'
           Valid: 1
         Samples: [4×7 table]
    TestActivity: [8×12 table]

Totals:
   1 Valid, 0 Invalid.
   2.7124 seconds testing time.

As indicated by the size of the TestActivity property, the performance testing framework collected 8 measurements. This number includes 4 measurements to warm up the code. The Samples property excludes warm-up measurements.
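To see this split directly, you can filter the TestActivity table by its Objective column. This is a sketch, assuming Objective is a categorical column holding 'warmup' and 'sample', as shown in the TestActivity display later in this example:

```matlab
% Split the collected measurements into warm-up and sample rows.
% Assumes TestActivity has a categorical Objective column with the
% categories 'warmup' and 'sample'.
activity = results(1).TestActivity;
warmupRows = activity(activity.Objective == 'warmup',:);
sampleRows = activity(activity.Objective == 'sample',:);
height(warmupRows)   % number of warm-up measurements
height(sampleRows)   % number of measurements retained in Samples
```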

Display the sample measurements for the first test.

results(1).Samples
ans =

  4×7 table

                 Name                 MeasuredTime         Timestamp              Host        Platform                     Version                                 RunIdentifier            
    ______________________________    ____________    ____________________    ___________    ________    __________________________________________    ____________________________________

    fprintfTest/testPrintingToFile      0.067729      24-Jun-2019 16:22:09    MY-HOSTNAME     win64      9.7.0.1141441 (R2019b) Prerelease Update 2    62991eef-5570-47b0-ade5-b8a805245e8f
    fprintfTest/testPrintingToFile      0.067513      24-Jun-2019 16:22:09    MY-HOSTNAME     win64      9.7.0.1141441 (R2019b) Prerelease Update 2    62991eef-5570-47b0-ade5-b8a805245e8f
    fprintfTest/testPrintingToFile      0.068737      24-Jun-2019 16:22:09    MY-HOSTNAME     win64      9.7.0.1141441 (R2019b) Prerelease Update 2    62991eef-5570-47b0-ade5-b8a805245e8f
    fprintfTest/testPrintingToFile      0.068576      24-Jun-2019 16:22:10    MY-HOSTNAME     win64      9.7.0.1141441 (R2019b) Prerelease Update 2    62991eef-5570-47b0-ade5-b8a805245e8f

Compute Statistics for Single Test Element

Display the mean measured time for the first test. To exclude data collected in the warm-up runs, use the values in the Samples property.

sampleTimes = results(1).Samples.MeasuredTime;
meanTest = mean(sampleTimes)
meanTest = 0.0681
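The same sample vector supports other descriptive statistics using standard MATLAB functions; for example:

```matlab
% Additional statistics over the sample measurements
sampleTimes = results(1).Samples.MeasuredTime;
minTest   = min(sampleTimes);              % fastest run
maxTest   = max(sampleTimes);              % slowest run
stdTest   = std(sampleTimes);              % spread across samples
relSpread = stdTest/mean(sampleTimes);     % spread relative to the mean
```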

Compute Statistics for All Test Elements

Determine the average time for all the test elements. The fprintfTest test includes two different methods. Compare the time for each method (test element).

Since the performance testing framework returns a Samples table for each test element, concatenate all these tables into one table. Then group the rows by test element Name, and compute the mean MeasuredTime for each group.

fullTable = vertcat(results.Samples);
summaryStats = varfun(@mean,fullTable, ...
    'InputVariables','MeasuredTime','GroupingVariables','Name')
summaryStats =

  2×3 table

                 Name                 GroupCount    mean_MeasuredTime
    ______________________________    __________    _________________

    fprintfTest/testPrintingToFile        4             0.068139    
    fprintfTest/testBytesToFile           9             0.071595    

Both test methods write the same amount of data to a file. Therefore, some of the difference between the mean values is attributable to calling the fprintf function with an output argument.

Change Statistical Objectives and Rerun Tests

Change the statistical objectives defined by the runperf function by constructing and running a time experiment. Construct a time experiment with measurements that reach a sample mean with a 3% relative margin of error within a 97% confidence level. Collect 4 warm-up measurements and up to 16 sample measurements.

Construct an explicit test suite.

suite = testsuite('fprintfTest');

Construct a time experiment with a variable number of sample measurements, and run the tests.

import matlab.perftest.TimeExperiment
experiment = TimeExperiment.limitingSamplingError('NumWarmups',4, ...
    'MaxSamples',16,'RelativeMarginOfError',0.03,'ConfidenceLevel',0.97);
resultsTE = run(experiment,suite);
Running fprintfTest
.......... ..........
Warning: Target Relative Margin of Error not met after running the MaxSamples for fprintfTest/testPrintingToFile
.........
Done fprintfTest
__________

In this example output, the performance testing framework is not able to meet the stricter statistical objectives with the specified number of maximum samples. Your results might vary.

Compute the statistics for all the test elements.

fullTableTE = vertcat(resultsTE.Samples);
summaryStatsTE = varfun(@mean,fullTableTE, ...
    'InputVariables','MeasuredTime','GroupingVariables','Name')
summaryStatsTE =

  2×3 table

                 Name                 GroupCount    mean_MeasuredTime
    ______________________________    __________    _________________

    fprintfTest/testPrintingToFile        16            0.069482    
    fprintfTest/testBytesToFile            4            0.067902    

Increase the maximum number of samples to 32 and rerun the time experiment.

experiment = TimeExperiment.limitingSamplingError('NumWarmups',4, ...
    'MaxSamples',32,'RelativeMarginOfError',0.03,'ConfidenceLevel',0.97);
resultsTE = run(experiment,suite);
Running fprintfTest
.......... ......
Done fprintfTest
__________

Compute the statistics for all the test elements.

fullTableTE = vertcat(resultsTE.Samples);
summaryStatsTE = varfun(@mean,fullTableTE, ...
    'InputVariables','MeasuredTime','GroupingVariables','Name')
summaryStatsTE =

  2×3 table

                 Name                 GroupCount    mean_MeasuredTime
    ______________________________    __________    _________________

    fprintfTest/testPrintingToFile        4             0.067228    
    fprintfTest/testBytesToFile           4             0.067766    

The testing framework achieves the statistical objectives for both tests with 4 samples each.
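You can approximate the relative margin of error from the samples yourself. This sketch uses the standard t-based confidence interval; it is illustrative, not necessarily the framework's exact internal computation, and tinv requires the Statistics and Machine Learning Toolbox:

```matlab
% Approximate the relative margin of error for one test element.
% Assumes the Name column is categorical (or string) so it can be
% compared directly against a string.
x = fullTableTE.MeasuredTime(fullTableTE.Name == "fprintfTest/testPrintingToFile");
n = numel(x);
tCrit  = tinv(1 - (1 - 0.97)/2, n - 1);         % two-sided, 97% confidence
relMoE = tCrit * std(x) / (sqrt(n) * mean(x));  % compare against the 0.03 target
```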

Measure First-Time Cost

Start a new MATLAB® session. A new session ensures that MATLAB has not run the code contained in your tests.

Measure the first-time cost of your code by creating and running a fixed time experiment with zero warm-up measurements and one sample measurement.

Construct an explicit test suite. Since you are measuring the first-time cost of the function, run a single test. To run multiple tests, save the results and start a new MATLAB session between tests.

suite = testsuite('fprintfTest/testPrintingToFile');

Construct and run the time experiment.

import matlab.perftest.TimeExperiment
experiment = TimeExperiment.withFixedSampleSize(1);
results = run(experiment,suite);
Running fprintfTest
.
Done fprintfTest
__________

Display the results. Observe the TestActivity table to confirm that there are no warm-up samples.

fullTable = results.TestActivity
fullTable =

  1×12 table

                 Name                 Passed    Failed    Incomplete    MeasuredTime    Objective         Timestamp              Host        Platform                     Version                                  TestResult                           RunIdentifier            
    ______________________________    ______    ______    __________    ____________    _________    ____________________    ___________    ________    __________________________________________    ________________________________    ____________________________________

    fprintfTest/testPrintingToFile    true      false       false         0.071754       sample      24-Jun-2019 16:31:27    MY-HOSTNAME     win64      9.7.0.1141441 (R2019b) Prerelease Update 2    [1×1 matlab.unittest.TestResult]    045394eb-e722-4241-8da2-1d17a97ac90a

The performance testing framework collects one sample for each test.
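To measure first-time cost for several tests, each in its own fresh session, one approach is to drive MATLAB from the operating-system command line. This is an untested sketch assuming the -batch startup option is available (introduced in R2019a); the result file names are arbitrary:

```
matlab -batch "import matlab.perftest.TimeExperiment; s = testsuite('fprintfTest/testPrintingToFile'); r1 = run(TimeExperiment.withFixedSampleSize(1),s); save result1.mat r1"
matlab -batch "import matlab.perftest.TimeExperiment; s = testsuite('fprintfTest/testBytesToFile'); r2 = run(TimeExperiment.withFixedSampleSize(1),s); save result2.mat r2"
```

Each invocation starts a clean session, runs one test with zero warm-ups and one sample, and saves the result for later comparison.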
