=============
PerfCompare
=============

.. contents::
   :depth: 5
   :local:

PerfCompare is an improved performance comparison tool that will soon replace Perfherder’s Compare View. It supports two comparison workflows: comparing up to three **new** revisions/patches against a **base** revision of a repository (mozilla-central, autoland, etc.), or comparing up to three **new** revisions against the **base** repository’s history over time. Both workflows produce results indicating whether patches have caused an improvement or a regression. The following documentation describes the app’s features and workflows in more detail.

Where can I find PerfCompare?
==============================

Aside from `the perf.compare website <https://perf.compare/>`_, it will be accessible from Perfherder’s Compare View search and results pages.

The source code can be viewed in the GitHub `repository <https://github.com/mozilla/perfcompare>`_.

Home / Search Page
====================

On landing in PerfCompare, two search comparison workflows are available: **Compare with a base** or **Compare over time**.

Compare with a base
--------------------

.. image:: ./perfcomparehomescreen.png
   :alt: PerfCompare Interface with Three Selected Revisions to Compare with a Base
   :scale: 50%
   :align: center

PerfCompare allows up to three **new** revisions to be compared against a **base** revision. The specific testing framework or harness can also be selected.

Compare over time
------------------

It’s also possible to select up to three revisions to compare against a base repository’s history over a specified time period.

.. image:: ./compareovertime.png
   :alt: PerfCompare Selection Interface for Revisions/Pushes to Compare over Time
   :scale: 50%
   :align: center

Results Page
=============

After pressing the Compare button, the results page displays information about the selected revisions along with the results table.

Edit the compared revisions
----------------------------

The compared revisions can be edited, and a new comparison can be computed to update the results table without having to return to the home page. Clicking the **Edit entry** button opens the edit view.

.. image:: ./resultseditentry.png
   :alt: PerfCompare Results Page Edit Entry Selection
   :scale: 50%
   :align: center

In the edit view, it’s possible to search for new revisions or delete selected ones. Cancelling returns to the previous selections; otherwise, once satisfied with the changes, clicking **Compare** updates the data in the results table.

.. image:: ./resultseditentryviewbase.png
   :alt: PerfCompare Results Page Compare with a Base Edit Entry View
   :scale: 50%
   :align: center

As with Compare with a base, clicking **Edit entry** for a Compare over time comparison opens the edit view, where the base repository or time range can be changed and selected revisions can be deleted or searched for.

.. image:: ./resultseditentryviewtime.png
   :alt: PerfCompare Results Page Compare over Time Edit Entry View
   :scale: 50%
   :align: center

Results Table
===============

Please refer to the `Understanding the Results <standard-workflow.html#understanding-the-results>`_ section of the Compare View documentation for information on interpreting the results table.

The results table can be searched by platform, title, or revision. Other frameworks can be selected to see the results in a different test harness. The **All revisions** dropdown provides options to view the results for a specific new revision.

.. image:: ./resultstable.png
   :alt: PerfCompare Results Table
   :scale: 50%
   :align: center

The **Download JSON** button generates a JSON output of the results data.

The results table can be filtered by platform, status (No Changes, Improvement, or Regression), or confidence (Low, Medium, High).

.. image:: ./resultstablefilters.png
   :alt: PerfCompare Results Table with Filters
   :scale: 50%
   :align: center

Expanded Rows
--------------

Clicking the **caret-down** button expands the row.

.. image:: ./resultstableexpanded.png
   :alt: PerfCompare Results Table with Expanded Row
   :scale: 50%
   :align: center

In the expanded view, hovering over a point or curve on the graphs shows more information about it.

.. image:: ./resultstableexpandedgraph.png
   :alt: PerfCompare Results Table with Hover Over The Graph
   :scale: 50%
   :align: center

Subtests
---------

When such data is available, clicking the **subtest icon** opens a new page containing information about the subtests for the selected result.

.. image:: ./resultstablesubtests.png
   :alt: PerfCompare Results Table with Subtests View
   :scale: 50%
   :align: center

Graph view
-----------

Clicking the **graph icon** opens the graph of the historical data, or graph view, for the job in a new window on Treeherder.

.. image:: ./resultstableexpandedgraph.png
   :alt: PerfCompare Results Table with Graph View
   :scale: 50%
   :align: center

Here is an example of the graph view after clicking this icon:

.. image:: ./resultstablegraphviewperfherder.png
   :alt: Historical Graph Data on Perfherder
   :scale: 50%
   :align: center

Retrigger test jobs
===================

It’s possible to retrigger jobs within Taskcluster. Clicking the **retrigger icon** shows a dialog to choose how many new runs should be started. Note that signing in with valid Taskcluster credentials is required.

.. image:: ./resultstableretrigger.png
   :alt: PerfCompare Results Table with Taskcluster Login
   :scale: 50%
   :align: center

.. image:: ./resultstableretriggerjobs.png
   :alt: PerfCompare Results Table with Retrigger Jobs Dialog
   :scale: 50%
   :align: center
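The file produced by the **Download JSON** button described above can be post-processed with standard tooling. Below is a minimal Python sketch of filtering downloaded results by confidence; note that the sample rows and the ``confidence_text`` field name are hypothetical illustrations, not part of PerfCompare’s documented output, so consult an actual downloaded file for the real schema.

```python
import json

# Hypothetical sample mimicking rows from a downloaded PerfCompare
# results file; the real schema may differ.
sample_json = """
[
  {"platform": "linux1804-64-shippable", "confidence_text": "High"},
  {"platform": "windows11-64-shippable", "confidence_text": "Low"}
]
"""

rows = json.loads(sample_json)

# Keep only high-confidence rows ("confidence_text" is an assumed field name).
high_confidence = [row for row in rows if row.get("confidence_text") == "High"]
print(len(high_confidence))  # prints 1
```

In practice, replace ``sample_json`` with the contents of the file saved by the **Download JSON** button.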