public inbox for gcc-cvs@sourceware.org
* [gcc(refs/vendors/microsoft/heads/main)] Add workflows to update from master and mirror vendor branches (#50)
@ 2021-05-27  1:20 Eugene Rozenfeld
  0 siblings, 0 replies; only message in thread
From: Eugene Rozenfeld @ 2021-05-27  1:20 UTC (permalink / raw)
  To: gcc-cvs

https://gcc.gnu.org/g:e8d1dbe847c19df0a3aea9aa611eacaf6a250ab5

commit e8d1dbe847c19df0a3aea9aa611eacaf6a250ab5
Author: Victor Tong <53017530+vitong@users.noreply.github.com>
Date:   Tue May 25 09:32:34 2021 -0700

    Add workflows to update from master and mirror vendor branches (#50)
    
    This change introduces two new workflows:
    
    update-main:
    
    Takes the latest master, starts a build-and-test run, and gathers the failing tests. It then creates a new branch in our GitHub repository off of our microsoft/main GCC vendor branch, adds the failing tests to the contrib/testsuite-management/x86_64-pc-linux-gnu.xfail file, and merges the latest master into it. Next it kicks off a build and test-gcc workflow instance to verify that all GCC tests are still passing. Finally, it generates a list of git commands that someone can use to manually check these changes into the vendor branch.
    update-mirror-branches:
    
    Mirrors the Microsoft main and 9.1.0 vendor branches in the GCC repository into our GitHub repository as the "current" and "gcc-9.1.0" branches.
    We will need a workflow to clean up stale branches created by update-main. I created an issue to look into this: https://github.com/microsoft/gcc/issues/52
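    The update-main workflow above repeatedly waits on a build-and-test run before proceeding. That waiting boils down to polling the run's status until it reports `completed` and then inspecting its conclusion; a minimal sketch of the loop with the GitHub API call stubbed out (`get_status` is a stand-in for illustration, not part of this change):

    ```python
    def wait_for_completion(get_status, sleep=lambda s: None):
        # Poll until the workflow run reports 'completed', then return
        # its conclusion -- the same shape of loop WaitOnLatestWorkflow uses.
        while True:
            status, conclusion = get_status()
            if status == "completed":
                return conclusion
            sleep(30)

    # Simulate a run that is queued, then in progress, then succeeds.
    states = iter([("queued", None), ("in_progress", None), ("completed", "success")])
    result = wait_for_completion(lambda: next(states))
    ```

    In the real workflow the status comes from the GitHub REST API and the sleep is a genuine 30-second pause between polls.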

Diff:
---
 .github/actions/print-gcc-testset/action.yaml     |  15 +
 .github/actions/pull-master-upstream/action.yaml  |  25 ++
 .github/actions/setup-vendor-branches/action.yaml |  27 ++
 .github/actions/test-composite/action.yaml        |   7 +-
 .github/scripts/checkout-refs.sh                  |   5 +
 .github/scripts/common.py                         |  28 +-
 .github/scripts/config.py                         |  35 ++-
 .github/scripts/configure-gcc.sh                  |   5 +-
 .github/scripts/downloadBuildArtifact.py          | 171 ++++++++---
 .github/scripts/gccWorkflow.py                    | 115 +++++--
 .github/workflows/build.yaml                      |  14 +-
 .github/workflows/fetch-rebase-test.yaml          |  23 +-
 .github/workflows/test-gcc.yaml                   |  43 ++-
 .github/workflows/update-main.yaml                | 354 ++++++++++++++++++++++
 .github/workflows/update-mirror-branches.yaml     |  61 ++++
 15 files changed, 821 insertions(+), 107 deletions(-)

diff --git a/.github/actions/print-gcc-testset/action.yaml b/.github/actions/print-gcc-testset/action.yaml
new file mode 100644
index 00000000000..cbfc44a5af2
--- /dev/null
+++ b/.github/actions/print-gcc-testset/action.yaml
@@ -0,0 +1,15 @@
+# Composite yaml that prints the GCC test set so it can be used as a YAML matrix variable for running tests
+outputs:
+  testSet:
+    description: 'GCC test set in JSON representation'
+    value: ${{ steps.printGCCTestSet.outputs.testSet }}
+runs:
+  using: "composite"
+  steps:
+    - name: Print GCC test set
+      id: printGCCTestSet
+      run: | 
+        chmod +x .github/scripts/gccWorkflow.py
+        export PYTHONPATH=${PYTHONPATH}:${PWD}/.github/scripts
+        python -c 'import sys; from gccWorkflow import *; GccWorkflow.PrintTestSet()'
+      shell: bash
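The `printGCCTestSet` step relies on GitHub's `::set-output` workflow command: the script prints a specially formatted line to stdout, and GitHub exposes the value to later steps as `steps.printGCCTestSet.outputs.testSet`. A minimal sketch of that mechanism (the `emit_output` helper is illustrative; the test-set names mirror those used in this change):

```python
import json

def emit_output(name, value):
    # GitHub Actions scans stdout for "::set-output name=<name>::<value>"
    # and exposes it as steps.<step_id>.outputs.<name> for later steps.
    line = "::set-output name=" + name + "::" + value
    print(line)
    return line

test_set = {"testSet": ["check-gcc-c", "check-gcc-c++"]}
emit_output("testSet", json.dumps(test_set))
```

Because the value is JSON, a downstream job can feed it straight into `fromJSON(...)` to build a matrix, as fetch-rebase-test.yaml does below.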
diff --git a/.github/actions/pull-master-upstream/action.yaml b/.github/actions/pull-master-upstream/action.yaml
new file mode 100644
index 00000000000..cf061030a17
--- /dev/null
+++ b/.github/actions/pull-master-upstream/action.yaml
@@ -0,0 +1,25 @@
+# Composite yaml to add the upstream gcc remote and checkout from their master branch
+inputs:
+  masterRef:
+    description: 'Reference off of gcc/master to sync to'
+    required: false
+    default: 'gcc/master'
+  scriptsRef:
+    description: 'Commit reference to checkout .github scripts from'
+    required: true
+    default: ''
+
+runs:
+  using: "composite"
+  steps: 
+    - name: Add remote and checkout from ${{ inputs.masterRef }}
+      run: |
+        git remote add gcc git://gcc.gnu.org/git/gcc.git
+        git fetch gcc master
+        chmod +x .github/scripts/checkout-refs.sh
+        .github/scripts/checkout-refs.sh ${MASTER_REF} ${SCRIPTS_REF}
+        git log -1
+      shell: bash
+      env:
+        SCRIPTS_REF: ${{ inputs.scriptsRef }}
+        MASTER_REF: ${{ inputs.masterRef }}
\ No newline at end of file
diff --git a/.github/actions/setup-vendor-branches/action.yaml b/.github/actions/setup-vendor-branches/action.yaml
new file mode 100644
index 00000000000..1874cd6f16b
--- /dev/null
+++ b/.github/actions/setup-vendor-branches/action.yaml
@@ -0,0 +1,27 @@
+# Composite yaml to add the upstream gcc remote, setup the vendor branches and checkout the specified vendor branch
+inputs:
+  vendorRef:
+    description: 'Vendor branch reference to check out'
+    required: true
+    default: ''
+  scriptsRef:
+    description: 'Commit reference to checkout .github scripts from'
+    required: true
+    default: ''
+
+runs:
+  using: "composite"
+  steps: 
+    - name: Add remote and checkout from vendor branch
+      run: |
+          git remote add gcc git://gcc.gnu.org/git/gcc.git
+          echo -e "dummyName\ndummyEmail\ngcc\ndummyName\nme\nyes" > dummyGitCustomization.txt
+          cat "dummyGitCustomization.txt" | contrib/gcc-git-customization.sh
+          contrib/git-fetch-vendor.sh microsoft
+          git branch -a
+          chmod +x .github/scripts/checkout-refs.sh
+          .github/scripts/checkout-refs.sh ${VENDOR_REF} ${SCRIPTS_REF}
+      shell: bash
+      env:
+        SCRIPTS_REF: ${{ inputs.scriptsRef }}
+        VENDOR_REF: ${{ inputs.vendorRef }}
diff --git a/.github/actions/test-composite/action.yaml b/.github/actions/test-composite/action.yaml
index b6f8b458fc5..1fabb8e5d18 100644
--- a/.github/actions/test-composite/action.yaml
+++ b/.github/actions/test-composite/action.yaml
@@ -16,10 +16,15 @@ inputs:
     description: 'Whether the build is already available on the machine this composite action is running on'
     required: true
     default: true
+outputs:
+  configJson:
+    description: 'Config object that may have been updated'
+    value: ${{ steps.runTests.outputs.noSecretConfigJson }}
 runs:
   using: "composite"
-  steps: 
+  steps:
     - name: Download build and run tests
+      id: runTests
       run: | 
         chmod +x .github/scripts/gccWorkflow.py
         echo "$PYTHONPATH"
diff --git a/.github/scripts/checkout-refs.sh b/.github/scripts/checkout-refs.sh
new file mode 100644
index 00000000000..1d91b895a12
--- /dev/null
+++ b/.github/scripts/checkout-refs.sh
@@ -0,0 +1,5 @@
+# $1 is the reference to use for checking out GCC sources
+# $2 is the reference to use for checking out the .github scripts
+git checkout $1 -f
+git fetch origin +refs/pull/*:refs/remotes/origin/refs/pull/*
+git checkout $2 -- .github/
\ No newline at end of file
diff --git a/.github/scripts/common.py b/.github/scripts/common.py
index 85ff1a6d321..2cbb101abf6 100644
--- a/.github/scripts/common.py
+++ b/.github/scripts/common.py
@@ -1,5 +1,8 @@
 import globals
 import logging
+import subprocess
+import requests
+import time
 
 # Exception class to raise when a basic workflow error happens
 class WorkflowError(Exception):
@@ -19,4 +22,27 @@ def RaiseWorkflowError(error):
 
 # This name needs to match what's in the build.yaml file
 def GetGccBuildName():
-    return 'build'
\ No newline at end of file
+    return 'build'
+
+def GetandPrintCurrentSHA():
+    res = subprocess.run('git rev-parse HEAD', shell=True, check=False, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+
+    # Output for Github to pick up output for future jobs
+    print("::set-output name=currentSHA::" + str(res.stdout, 'utf-8'))
+
+def SendGetRestCmd(restCmd, restArgs, restHeader):
+    logger = GetLogger()
+    retryCount = 10
+    while True:
+        res = requests.get(restCmd, params=restArgs, headers=restHeader)
+        if res.ok:
+            return res
+
+        # Retry the command after a sleep
+        sleepTimeSeconds = 30
+        logger.error("Sleeping for " + str(sleepTimeSeconds) + " seconds before retrying " + restCmd)
+        time.sleep(sleepTimeSeconds)
+
+        retryCount = retryCount - 1
+        if retryCount == 0:
+            RaiseWorkflowError("Max retry count hit with " + restCmd)
\ No newline at end of file
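The bounded-retry loop in `SendGetRestCmd` can be exercised without real HTTP by injecting the request function. A hedged sketch of the same pattern (`Response` and `flaky_fetch` are stand-ins for illustration, not part of the change):

```python
class Response:
    # Minimal stand-in for requests.Response: only the .ok flag is used.
    def __init__(self, ok):
        self.ok = ok

def send_with_retries(fetch, retry_count=10, sleep=lambda s: None):
    # Mirrors SendGetRestCmd: return the first successful response,
    # sleep between attempts, and fail once the retry budget is spent.
    while True:
        res = fetch()
        if res.ok:
            return res
        sleep(30)
        retry_count -= 1
        if retry_count == 0:
            raise RuntimeError("Max retry count hit")

attempts = []
def flaky_fetch():
    # Fail twice, then succeed on the third attempt.
    attempts.append(1)
    return Response(ok=len(attempts) >= 3)
```

In the workflow the sleep is a real 30-second pause, which matters for the long-polling done against the GitHub Actions API.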
diff --git a/.github/scripts/config.py b/.github/scripts/config.py
index 96e81a3fc8d..a1ddbf4b1bf 100644
--- a/.github/scripts/config.py
+++ b/.github/scripts/config.py
@@ -3,6 +3,7 @@ from json import JSONEncoder
 import globals
 from common import *
 from downloadBuildArtifact import *
+import sys
 
 # subclass JSONEncoder to convert the Config object into JSON
 class ConfigEncoder(JSONEncoder):
@@ -11,10 +12,13 @@ class ConfigEncoder(JSONEncoder):
 
 class Config(object):
 
-    def __init__(self, sha, token, buildArtifactID):
-        self.commitSHA = sha                         # Commit SHA for the checkin
-        self.accessToken = token                     # Access token for REST APIs
-        self.gccBuildArtifactID = buildArtifactID    # GCC Build artifact ID
+    def __init__(self, sha, token, buildArtifactID, workflowRunID, failedWorkflowErrorStr, failedWorkflowReturnCode):
+        self.commitSHA = sha                              # Commit SHA for the checkin
+        self.accessToken = token                          # Access token for REST APIs
+        self.gccBuildArtifactID = buildArtifactID         # GCC Build artifact ID
+        self.runID = workflowRunID                        # Run ID of workflow (not job) that ran Setup()
+        self.failedWorkflowError = failedWorkflowErrorStr # Whether or not the workflow failed
+        self.failedReturnCode = failedWorkflowReturnCode  # Failed workflow error code
         logger = GetLogger()
         logger.info('creating an instance of Config')
 
@@ -30,6 +34,8 @@ class Config(object):
         commit = githubJson["sha"]
         if ("pull_request" in githubJson["event"]):
             commit = githubJson["event"]["pull_request"]["head"]["sha"]
+
+        runID = githubJson["run_id"]
                 
         logger.info("SHA = " + commit)
         workflowName = githubJson["workflow"]
@@ -41,7 +47,7 @@ class Config(object):
             gccBuildArtifactID = 0
 
         # Construct Config object
-        newConfig = Config(commit, accessToken, gccBuildArtifactID)
+        newConfig = Config(commit, accessToken, gccBuildArtifactID, runID, '', 0)
         configJson = json.dumps(newConfig, cls=ConfigEncoder)
 
         # Output for Github to pick up output for future steps
@@ -51,12 +57,10 @@ class Config(object):
         globals.configObj = newConfig
 
     @staticmethod
-    def PrintNoSecretConfigJson(configJson):
+    def PrintNoSecretConfigJson():
         # GitHub won't allow the printing of strings with secrets in them for future jobs
         # so we need to clear out any secrets in our config object prior to printing it
 
-        Config.Reload(configJson)
-
         # Clear out the access token field
         globals.configObj.accessToken = globals.configObj.accessToken.replace(globals.configObj.accessToken, '')
         configJson = json.dumps(globals.configObj, cls=ConfigEncoder)
@@ -64,6 +68,11 @@ class Config(object):
         # Output for Github to pick up output for future jobs
         print("::set-output name=noSecretConfigJson::" + configJson)
 
+    @staticmethod
+    def PrintNoSecretConfigJsonFromJson(configJson):
+        Config.Reload(configJson)
+        Config.PrintNoSecretConfigJson()
+
     @staticmethod
     def Reload(configJson, accessToken=""):
         loadedJson = json.loads(configJson)
@@ -74,4 +83,12 @@ class Config(object):
             accessToken=loadedJson["accessToken"]
 
         # Set global config object
-        globals.configObj = Config(loadedJson["commitSHA"], accessToken, loadedJson["gccBuildArtifactID"])
+        globals.configObj = Config(loadedJson["commitSHA"], accessToken, loadedJson["gccBuildArtifactID"], loadedJson["runID"], loadedJson["failedWorkflowError"], loadedJson["failedReturnCode"])
+        
+    @staticmethod
+    def RaiseErrorIfWorkflowFailed():
+        if (globals.configObj.failedWorkflowError):
+            logger = GetLogger()
+            logger.error(globals.configObj.failedWorkflowError)
+            logging.shutdown()
+            sys.exit(globals.configObj.failedReturnCode)
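`PrintNoSecretConfigJson` works by blanking the `accessToken` field before serializing, because GitHub refuses to propagate step outputs that contain secrets. The scrub-then-serialize step in isolation (field names follow `Config`; the `scrub_config_json` helper and sample values are illustrative):

```python
import json

def scrub_config_json(config):
    # Blank the secret field, then serialize so the result is safe
    # to emit via ::set-output for consumption by later jobs.
    config = dict(config)
    config["accessToken"] = ""
    return json.dumps(config)

cfg = {"commitSHA": "e8d1dbe", "accessToken": "ghp_secret", "gccBuildArtifactID": 0}
no_secret = scrub_config_json(cfg)
```

Later jobs reload the scrubbed JSON with `Config.Reload` and re-inject the token from the `GITHUB_TOKEN` secret, which is why `Reload` accepts an optional `accessToken` argument.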
diff --git a/.github/scripts/configure-gcc.sh b/.github/scripts/configure-gcc.sh
index 319eae9536e..b91087f8297 100644
--- a/.github/scripts/configure-gcc.sh
+++ b/.github/scripts/configure-gcc.sh
@@ -16,7 +16,4 @@ sudo update-alternatives --set c++ /usr/bin/g++
 
 sudo apt install -y texinfo
 sudo apt-get install -y dejagnu
-./contrib/download_prerequisites
-
-cd ..
-mkdir objdir
\ No newline at end of file
+./contrib/download_prerequisites
\ No newline at end of file
diff --git a/.github/scripts/downloadBuildArtifact.py b/.github/scripts/downloadBuildArtifact.py
index 0406442b479..c6a9a926b11 100644
--- a/.github/scripts/downloadBuildArtifact.py
+++ b/.github/scripts/downloadBuildArtifact.py
@@ -11,16 +11,96 @@ import zipfile
 import time
 from common import *
 import globals
+import re
+import tarfile
+import os
+import subprocess
+
+def WaitOnLatestWorkflow(branch, workflowName, accessToken, raiseErrorOnFailure):
+    logger = GetLogger()
+    logger.info("Looking for a " + workflowName + " build for branch: " + branch)
+
+    latestWorkflowTime = dt.datetime(1970, 1, 1)
+    latestWorkflowId = 0
+
+    while latestWorkflowId == 0:
+        # Look at all workflow runs on the branch
+        res = SendGetRestCmd("https://api.github.com/repos/microsoft/gcc/actions/runs?branch=" + branch, {}, {'Authorization': "token " + accessToken})
+        runs = res.json()['workflow_runs']
+
+        # This could be optimized if we can guarantee that the results are sorted. It seems like the results are sorted from most recent to least recent, but there
+        # doesn't seem to be any documentation supporting this.
+
+        for entry in runs:
+            if (entry['name'] == workflowName):
+                date = datetime.strptime(entry['created_at'], '%Y-%m-%dT%H:%M:%SZ')
+                if (latestWorkflowTime < date):
+                    latestWorkflowTime = date
+                    latestWorkflowId = entry['id']
+
+    
+    logger.info("Found workflow ID: " + str(latestWorkflowId))
+    buildCompleted = False
+    conclusion = ''
+    # Wait until latestWorkflowId completes
+    while buildCompleted == False:
+        res = SendGetRestCmd("https://api.github.com/repos/microsoft/gcc/actions/runs/" + str(latestWorkflowId), {}, {'Authorization': "token " + accessToken})
+        if res.ok:
+            resJson = res.json()
+            status = resJson['status']
+            if (status == 'completed'):
+                buildCompleted = True
+                conclusion = resJson['conclusion']
+
+        if not buildCompleted:
+            # Sleep for 30 seconds
+            sleepTimeSeconds = 30
+            logger.error("Sleeping for " + str(sleepTimeSeconds) + " seconds while waiting on build number " + str(latestWorkflowId) + " to finish")
+            time.sleep(sleepTimeSeconds)
+
+    logger.info("Run completed. Link available at:")
+    logger.info("https://github.com/microsoft/gcc/actions/runs/" + str(latestWorkflowId))
+
+    if raiseErrorOnFailure:
+        if conclusion != 'success':
+            RaiseWorkflowError("Workflow did not succeed. Conclusion was: " + conclusion)
+
+def FindGccBuildArtifact(workflowRunID, accessToken):
+    logger = GetLogger()
+    numArtifactRetries = 60
+    while True:
+        res = SendGetRestCmd("https://api.github.com/repos/microsoft/gcc/actions/runs/" + str(workflowRunID) + "/artifacts", {}, {'Authorization': "token " + accessToken})
+        logger.debug("Artifact Json:" + json.dumps(res.json()))
+
+        gccBuildArtifactID = 0
+        for artifact in res.json()['artifacts']:
+            if (artifact['name'] == "gccBuild"):
+                gccBuildArtifactID = artifact['id']
+
+        if (gccBuildArtifactID != 0):
+            break
+        elif (gccBuildArtifactID == 0 and numArtifactRetries == 0):
+            RaiseWorkflowError("No gcc build artifact found")
+
+        numArtifactRetries = numArtifactRetries-1
+
+        # We need this sleep here because even if the build job status is "completed", the gccBuild artifacts aren't
+        # immediately ready. If for some reason they aren't, sleep for a bit and retry.
+        
+        # Sleep for 30 seconds
+        sleepTimeSeconds = 30
+        logger.info("Sleeping for " + str(sleepTimeSeconds) + " seconds while waiting on artifacts for workflow run ID " + str(workflowRunID))
+        time.sleep(sleepTimeSeconds)
+
+    logger.info("GCC Build artifact ID = " + str(gccBuildArtifactID))
+    return gccBuildArtifactID
 
 def WaitOnGccBuild(commit, accessToken):
     logger = GetLogger()
     logger.info("Looking for a gcc build for " + commit)
     
-    reqArgs = {'check_name': GetGccBuildName()}
-    res = requests.get("https://api.github.com/repos/microsoft/gcc/commits/" + commit + "/check-runs", params=reqArgs, headers={'Authorization': "token " + accessToken})
-
+    res = SendGetRestCmd("https://api.github.com/repos/microsoft/gcc/commits/" + commit + "/check-runs", {'check_name': GetGccBuildName()}, {'Authorization': "token " + accessToken})
     checkruns = res.json()['check_runs']
-    count = 0
     latestBuildTime = dt.datetime(1970, 1, 1)
     latestBuildId = 0
 
@@ -38,14 +118,13 @@ def WaitOnGccBuild(commit, accessToken):
 
     logger.info("Found latest build workflow ID: " + str(latestBuildId))
 
-    buildStatus = ''
     while True:
         # Get the run ID
-        res = requests.get("https://api.github.com/repos/microsoft/gcc/actions/jobs/" + str(latestBuildId), headers={'Authorization': "token " + accessToken})
+        res = SendGetRestCmd("https://api.github.com/repos/microsoft/gcc/actions/jobs/" + str(latestBuildId), {}, {'Authorization': "token " + accessToken})
         jobJson = res.json()
         workflowRunID = jobJson['run_id']
-        buildStatus = jobJson['status']
-        if (buildStatus == 'completed'):
+
+        if jobJson['status'] == 'completed':
             logger.debug(jobJson)
             conclusion = jobJson['conclusion']
             if (conclusion == 'success'):
@@ -61,49 +140,57 @@ def WaitOnGccBuild(commit, accessToken):
 
     logger.info("Workflow run ID = " + str(workflowRunID))
 
-    numArtifactRetries = 60
-    # Add error checking for the number of artifacts
-    while True:
-        res = requests.get("https://api.github.com/repos/microsoft/gcc/actions/runs/" + str(workflowRunID) + "/artifacts", headers={'Authorization': "token " + accessToken})
-
-        logger.debug("Artifact Json:" + json.dumps(res.json()))
+    gccBuildArtifactID = FindGccBuildArtifact(workflowRunID, accessToken)
 
-        gccBuildArtifactID = 0
-        for artifact in res.json()['artifacts']:
-            if (artifact['name'] == "gccBuild"):
-                gccBuildArtifactID = artifact['id']
+    return gccBuildArtifactID
 
-        if (gccBuildArtifactID != 0):
-            break
-        elif (gccBuildArtifactID == 0 and numArtifactRetries == 0):
-            RaiseWorkflowError("No gcc build artifact found")
+# Returns a list of artifact objects in the runId that match the regex
+def GetArtifactObjsInRun(runId, regex):
+    r = re.compile(regex)
+    matchedList = []
+    
+    res = SendGetRestCmd("https://api.github.com/repos/microsoft/gcc/actions/runs/" + str(runId) + "/artifacts", {}, {'Authorization': "token " + globals.configObj.accessToken})
+    artifactsList = res.json()['artifacts']
+    for artifact in artifactsList:
+        name = artifact['name']
+        if r.match(name):
+            matchedList.append(artifact)
+    
+    return matchedList
 
-        numArtifactRetries = numArtifactRetries-1
+def DownloadArtifact(artifactID):
+    logger = GetLogger()
+    res = SendGetRestCmd("https://api.github.com/repos/microsoft/gcc/actions/artifacts/" + str(artifactID), {}, {'Authorization': "token " + globals.configObj.accessToken})
+    name = res.json()['name']
+    zipName = name + '.zip'
 
-        # We need this sleep here because even if the build job status is "completed", the gccBuild artifacts aren't
-        # immediately ready. If for some reason they aren't, sleep for a bit and retry.
-        
-        # Sleep for 30 seconds
-        sleepTimeSeconds = 30
-        logger.info("Sleeping for " + str(sleepTimeSeconds) + " seconds while waiting on artifacts for workflow run ID " + str(workflowRunID))
-        time.sleep(sleepTimeSeconds)
+    res = SendGetRestCmd("https://api.github.com/repos/microsoft/gcc/actions/artifacts/" + str(artifactID) +"/zip", {}, {'Authorization': "token " + globals.configObj.accessToken})
+    
+    logger.info("Downloading " + name + " zip from artifact ID " + str(artifactID))
+    with open(zipName, 'wb') as f:
+        f.write(res.content)
 
-    logger.info("GCC Build artifact ID = " + str(gccBuildArtifactID))
+    logger.info("Unzipping zip")
+    with zipfile.ZipFile(zipName, 'r') as zip_ref:
+        zip_ref.extractall(name)
+    print('currentCWD')
+    print(os.getcwd())
+    logger.info("Done downloading")
 
-    return gccBuildArtifactID
+    return name
 
 def DownloadBuildArtifact():
     logger = GetLogger()
-    gccBuildArtifactID = globals.configObj.gccBuildArtifactID
+
+    if (globals.configObj.gccBuildArtifactID):
+        gccBuildArtifactID = globals.configObj.gccBuildArtifactID
+    else:
+        gccBuildArtifactID = FindGccBuildArtifact(globals.configObj.runID, globals.configObj.accessToken)
+
     # TODO: Support downloading build artifact without a config object setup so the script can be used outside of workflows by developers
-    res = requests.get("https://api.github.com/repos/microsoft/gcc/actions/artifacts/" + str(gccBuildArtifactID) +"/zip", headers={'Authorization': "token " + globals.configObj.accessToken})
-    
-    logger.info("Downloading gccBuild zip from artifact ID" + str(gccBuildArtifactID))
-    with open('gccBuild.zip', 'wb') as f:
-        f.write(res.content)
+    filename = DownloadArtifact(gccBuildArtifactID)
+    res = subprocess.run('mv ' + filename + '/* ../objdir -f ',
+        shell=True, check=False, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
 
-    logger.info("Unzipping zip")
-    with zipfile.ZipFile('gccBuild.zip', 'r') as zip_ref:
-        zip_ref.extractall('gccBuild')
+    logger.info("mv cmd output = " + str(res.stdout, 'utf-8'))
 
-    logger.info("Done downloading")
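`GetArtifactObjsInRun` selects artifacts from a run by matching their names against a regex (for example `'.*_logs$'` in `ParseTestLogs`). The selection logic in isolation (the sample artifact names are invented for illustration):

```python
import re

def match_artifacts(artifacts, regex):
    # Keep only artifact objects whose name matches the pattern,
    # the same re.match-based filter used by GetArtifactObjsInRun.
    r = re.compile(regex)
    return [a for a in artifacts if r.match(a["name"])]

artifacts = [
    {"name": "gccBuild", "id": 1},
    {"name": "check-gcc-c_logs", "id": 2},
    {"name": "check-gcc-c++_logs", "id": 3},
]
log_artifacts = match_artifacts(artifacts, ".*_logs$")
```

Note that `re.match` anchors at the start of the name, so the leading `.*` is what lets arbitrary test-set prefixes through while the trailing `$` excludes names that merely contain `_logs`.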
diff --git a/.github/scripts/gccWorkflow.py b/.github/scripts/gccWorkflow.py
index f686e3f6ce2..5fad1ba54e1 100644
--- a/.github/scripts/gccWorkflow.py
+++ b/.github/scripts/gccWorkflow.py
@@ -4,16 +4,24 @@ from downloadBuildArtifact import *
 import sys
 import globals
 from common import *
-import os.path 
+import os
+import json
 
 class GccWorkflow(object):
-    
     # Setup the config object and wait for any runs necessary to finish before proceeding onto the next job section
     # to avoid exceeding the 6 hr limit on Github Actions that run on Github machines
     @staticmethod
     def Init(githubContext, accessToken, isInitForBuild=False):
         Config.Setup(githubContext, accessToken, isInitForBuild)
 
+    @staticmethod
+    def PrintTestSet():
+        dictionary = {
+            "testSet" : ["check-target-libstdc++-v3", "check-gcc-c++", "check-gcc-c", "check-target-libgomp", "check-target-libitm", "check-target-libatomic"]
+        }
+        dictionaryJson = json.dumps(dictionary)
+        print("::set-output name=testSet::" + dictionaryJson)
+
     # Runs the configure script to set up gcc configuration environment prior to building and running tests
     # Creates the objdir directory as part of this process
     @staticmethod
@@ -26,6 +34,17 @@ class GccWorkflow(object):
 
         logger.info("output = " + str(res.stdout, 'utf-8'))
 
+    @staticmethod
+    def MakeObjDir():
+        logger = GetLogger()
+        res = subprocess.run('''
+            cd ..
+            mkdir objdir
+            ''',
+            shell=True, check=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+
+        logger.info("output = " + str(res.stdout, 'utf-8'))
+
     @staticmethod
     def Build(configJson):
         logger = GetLogger()
@@ -33,6 +52,7 @@ class GccWorkflow(object):
         Config.Reload(configJson)
 
         GccWorkflow.Configure()
+        GccWorkflow.MakeObjDir()
 
         # Build
         res = subprocess.run('''
@@ -54,28 +74,18 @@ class GccWorkflow(object):
 
         # Convert the input string (because that's how it's passed via the composite template) into a boolean
         buildDownloaded = buildDownloadedStr == 'True' or buildDownloadedStr == 'true'
-        Config.Reload(configJson, accessToken)  
+        Config.Reload(configJson, accessToken)
+        
+        GccWorkflow.Configure()
         
         if (not buildDownloaded):
             logger.info("Downloading gccBuild artifact from 'build' workflow...")
+            GccWorkflow.MakeObjDir()
             try:
                 DownloadBuildArtifact()
             except:
                 logger.error("Could not download build artifact")
                 RaiseWorkflowError("Error downloading build artifact for GCC workflow")
-            
-            GccWorkflow.Configure()
-            res = subprocess.run('''
-                mv gccBuild/* ../objdir -f
-                ''',
-                shell=True, check=False, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
-                
-            logger.info("mv cmd output = " + str(res.stdout, 'utf-8'))
-
-            if (res.returncode != 0):
-                logger.error("Failed to copy gccBuild into objdir directory")    
-                logging.shutdown()
-                sys.exit(res.returncode)
         
         # The gcc build should be in the objdir folder one level above
         currentDir = os.getcwd()
@@ -93,7 +103,74 @@ class GccWorkflow(object):
 
         # TODO: Add better handling of errors to display
         if (res.returncode != 0):
-            logger.error("GCC Test failed")
-            logging.shutdown()
-            sys.exit(res.returncode)
+            globals.configObj.failedWorkflowError = "GCC Test failed"
+            globals.configObj.failedReturnCode = res.returncode
+
+        Config.PrintNoSecretConfigJson()
+           
+    @staticmethod
+    def ParseTestLogs(configJson, accessToken):
+        logger = GetLogger()
+        logger.info("GCC Parsing Test Logs")
+
+        Config.Reload(configJson, accessToken)
+
+        # Download all logs
+        logDirs = set()
+        artifacts = GetArtifactObjsInRun(globals.configObj.runID, '.*_logs$')
+        for artifact in artifacts:
+            DownloadArtifact(artifact['id'])
+            logDirs.add(artifact['name'])
+
+            # Each directory should have a failures.txt file
+            assert os.path.exists(artifact['name'] + '/failures.txt')
+
+        # Load in the tests known to fail in our vendor branch
+        with open('contrib/testsuite-management/x86_64-pc-linux-gnu.xfail') as xFailFile:
+            linesFailFile = {line.rstrip('\n') for line in xFailFile}
+
+        newFailuresToAdd = set()
+
+        for logDir in logDirs:
+            logger.info ("Parsing log directory: " + logDir)
+            # Create a rawFailures.txt file with just the unexpected failure lines
+            res = subprocess.run('cd ' + logDir                
+                + '''
+                sed -n '/^Unexpected results in this build/,${p;/^Expected results not present in this build/q}' failures.txt > rawFailures.txt
+                sed -e '1d' -e '$d' -i rawFailures.txt 
+                sed -r '/^\s*$/d' -i rawFailures.txt
+                ''',
+                shell=True, check=False, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+            logger.info("sed output = " + str(res.stdout, 'utf-8'))
+
+            with open(logDir + '/rawFailures.txt') as rawFailuresFile:
+                failsInThisDir = {line.rstrip('\n') for line in rawFailuresFile}
+            newFails = failsInThisDir - linesFailFile
+
+            newFailuresToAdd = newFailuresToAdd | newFails
+
+        if len(newFailuresToAdd) == 0:
+            logger.info("No new failures detected")
+            return
+
+        with open('contrib/testsuite-management/x86_64-pc-linux-gnu.xfail', 'a') as xFailFile:
+            logger.info("New failures found:")
+            for newFail in newFailuresToAdd:
+                logger.info(newFail)
+                xFailFile.write("\n")
+                xFailFile.write(newFail)
+
+
+        # Create a patch file 
+        res = subprocess.run('''
+                git diff contrib/testsuite-management/x86_64-pc-linux-gnu.xfail > newFailures.patch
+                ''',
+                shell=True, check=False, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+        logger.info("git diff output = " + str(res.stdout, 'utf-8'))
         
+        res = subprocess.run('''
+                git add contrib/testsuite-management/x86_64-pc-linux-gnu.xfail
+                git commit -m "Update xfail with new failures"
+                ''',
+                shell=True, check=False, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+        logger.info("git add and commit output = " + str(res.stdout, 'utf-8'))
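The core of `ParseTestLogs` is a set difference: the unexpected failures collected from each log directory, minus those already recorded in the xfail baseline, are what get appended to the xfail file. That step in isolation (the failure strings are invented examples):

```python
def new_failures(known_xfails, per_dir_failures):
    # Union the per-directory unexpected failures, then drop those
    # already present in the xfail baseline -- the set ParseTestLogs
    # appends to x86_64-pc-linux-gnu.xfail.
    fresh = set()
    for fails in per_dir_failures:
        fresh |= set(fails) - set(known_xfails)
    return fresh

known = {"FAIL: gcc.dg/old-known.c"}
per_dir = [
    ["FAIL: gcc.dg/old-known.c", "FAIL: gcc.dg/new-1.c"],
    ["FAIL: g++.dg/new-2.C"],
]
```

Using sets makes duplicate failures across test-set chunks harmless, since each line is appended to the xfail file at most once.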
diff --git a/.github/workflows/build.yaml b/.github/workflows/build.yaml
index ccab43c927c..8b6fe7f9932 100644
--- a/.github/workflows/build.yaml
+++ b/.github/workflows/build.yaml
@@ -6,17 +6,13 @@ on:
   push:
     branches:
       - current
-      - test
       - 'releases/**'
-      - 'develop/**'
-      - 'gcc**'
+      - gcc-9.1.0
   pull_request:
     branches:
       - current
-      - test
       - 'releases/**'
-      - 'develop/**'
-      - 'gcc**'
+      - gcc-9.1.0
   workflow_dispatch:
 
 jobs:    
@@ -37,8 +33,7 @@ jobs:
         
       # Install requests package which is used in downloadBuildArtifact.py 
       - name: Pip Install Requests
-        run: |
-          python -m pip install requests
+        run: python -m pip install requests
         shell: bash
 
       # Setup config
@@ -56,7 +51,8 @@ jobs:
           GITHUB_CONTEXT: ${{ toJson(github) }}
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
 
-      - uses: ./.github/actions/build-composite
+      - name: Build GCC
+        uses: ./.github/actions/build-composite
         with: 
           configjson: ${{ steps.setupconfig.outputs.configJson }}
 
diff --git a/.github/workflows/fetch-rebase-test.yaml b/.github/workflows/fetch-rebase-test.yaml
index f41ba70c98f..5116b5889d1 100644
--- a/.github/workflows/fetch-rebase-test.yaml
+++ b/.github/workflows/fetch-rebase-test.yaml
@@ -11,6 +11,7 @@ jobs:
     runs-on: ubuntu-18.04
     outputs:
       config: ${{ steps.printNoSecretJson.outputs.noSecretConfigJson }}
+      matrixTestSet: ${{ steps.printGCCTestSet.outputs.testSet }}
     steps:
       - name: Checkout
         uses: actions/checkout@v2
@@ -30,7 +31,12 @@ jobs:
         run: |
           python -m pip install requests
         shell: bash
-        
+
+      # Print the chunked set of GCC tests we'd like to run in a matrix later
+      - name: Print GCC test set
+        id: printGCCTestSet
+        uses: ./.github/actions/print-gcc-testset
+
       - uses: ./.github/actions/rebase-gcc-master
       
         # Setup config
@@ -48,7 +54,8 @@ jobs:
           GITHUB_CONTEXT: ${{ toJson(github) }}
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
           
-      - uses: ./.github/actions/build-composite
+      - name: Build GCC
+        uses: ./.github/actions/build-composite
         with: 
           configjson: ${{ steps.setupconfig.outputs.configJson }}
 
@@ -67,7 +74,7 @@ jobs:
         run: |
           chmod +x .github/scripts/config.py
           export PYTHONPATH=${PYTHONPATH}:${PWD}/.github/scripts
-          python -c 'import sys; from config import *; Config.PrintNoSecretConfigJson(sys.argv[1])' "${CONFIG_JSON}"
+          python -c 'import sys; from config import *; Config.PrintNoSecretConfigJsonFromJson(sys.argv[1])' "${CONFIG_JSON}"
         shell: bash
         env:
           CONFIG_JSON: ${{ steps.setupconfig.outputs.configJson }}
@@ -75,9 +82,7 @@ jobs:
   fetch-rebase-test: 
     needs: fetch-rebase-build
     strategy:
-      matrix:
-        # If this test is updated, make sure you update the duplicate line in test-gcc.yaml
-        testSet: [check-target-libstdc++-v3, check-gcc-c++, check-gcc-c, check-target-libgomp, check-target-libitm, check-target-libatomic]
+      matrix: ${{fromJSON(needs.fetch-rebase-build.outputs.matrixTestSet)}}
       # Avoid cancelling other matrix chunks even if one fails
       fail-fast: false        
     runs-on: ubuntu-18.04
@@ -101,7 +106,8 @@ jobs:
           python -m pip install requests
         shell: bash
         
-      - uses: ./.github/actions/rebase-gcc-master
+      - name: Rebase to latest upstream GCC master
+        uses: ./.github/actions/rebase-gcc-master
 
       - name: Download gccBuild from fetch-rebase-build step
         uses: actions/download-artifact@v2
@@ -113,7 +119,8 @@ jobs:
         run: mv objdir ../objdir
         
       # build.yaml creates the gccBuild artifact
-      - uses: ./.github/actions/test-composite
+      - name: Run GCC tests
+        uses: ./.github/actions/test-composite
         with: 
           configjson: ${{needs.fetch-rebase-build.outputs.config}}
           testSet: ${{ matrix.testSet }}
diff --git a/.github/workflows/test-gcc.yaml b/.github/workflows/test-gcc.yaml
index e59bb629bb3..5d0b2f29c8d 100644
--- a/.github/workflows/test-gcc.yaml
+++ b/.github/workflows/test-gcc.yaml
@@ -8,17 +8,13 @@ on:
   push:
     branches:
       - current
-      - test
       - 'releases/**'
-      - 'develop/**'
-      - 'gcc**'
+      - gcc-9.1.0
   pull_request:
     branches:
       - current
-      - test
       - 'releases/**'
-      - 'develop/**'
-      - 'gcc**'
+      - gcc-9.1.0
   # the problem with workflow_run is that the run isn't associated with a PR and won't show up in a PR
   #workflow_run:
   #  workflows: ["build"]
@@ -31,6 +27,7 @@ jobs:
     runs-on: ubuntu-18.04
     outputs:
       config: ${{ steps.printNoSecretJson.outputs.noSecretConfigJson }}
+      matrixTestSet: ${{ steps.printGCCTestSet.outputs.testSet }}
     steps:
       - name: checkout
         uses: actions/checkout@v2
@@ -50,6 +47,11 @@ jobs:
           python -m pip install requests
         shell: bash
 
+      # Print the chunked set of GCC tests we'd like to run in a matrix later
+      - name: Print GCC test set
+        id: printGCCTestSet
+        uses: ./.github/actions/print-gcc-testset
+
       # Setup config
       - name: Setup config
         id: setupconfig
@@ -68,16 +70,14 @@ jobs:
         run: |
           chmod +x .github/scripts/config.py
           export PYTHONPATH=${PYTHONPATH}:${PWD}/.github/scripts
-          python -c 'import sys; from config import *; Config.PrintNoSecretConfigJson(sys.argv[1])' "${CONFIG_JSON}"
+          python -c 'import sys; from config import *; Config.PrintNoSecretConfigJsonFromJson(sys.argv[1])' "${CONFIG_JSON}"
         shell: bash
         env:
           CONFIG_JSON: ${{ steps.setupconfig.outputs.configJson }}
   test:
     needs: init
     strategy:
-      matrix:
-        # If this test is updated, make sure you update the duplicate line in fetch-rebase-test.yaml
-        testSet: [check-target-libstdc++-v3, check-gcc-c++, check-gcc-c, check-target-libgomp, check-target-libitm, check-target-libatomic]
+      matrix: ${{fromJSON(needs.init.outputs.matrixTestSet)}}
       # Avoid cancelling other matrix chunks even if one fails
       fail-fast: false        
     runs-on: ubuntu-18.04
@@ -100,19 +100,20 @@ jobs:
         
     # Install requests package which is used in downloadBuildArtifact.py 
       - name: Pip Install Requests
-        run: |
-          python -m pip install requests
+        run: python -m pip install requests
         shell: bash
 
       # build.yaml creates the gccBuild artifact
-      - uses: ./.github/actions/test-composite
+      - name: Run GCC tests
+        uses: ./.github/actions/test-composite
+        id: test-composite
         with: 
           configjson: ${{needs.init.outputs.config}}
           testSet: ${{ matrix.testSet }}
           githubtoken: ${{ secrets.GITHUB_TOKEN }}
           buildDownloaded: False
 
-      - name: Move objdir to be in repo so logs inside can be uploaded
+      - name: Move objdir to be in repo so logs inside can be uploaded 
         run: mv ../objdir objdir
 
       - name: Upload build output
@@ -121,3 +122,17 @@ jobs:
           name: ${{ matrix.testSet }}_logs
           path: objdir/logs
       
+      # We have to do this after the upload step because there's no REST API to upload build artifacts, so we have to rely on the upload-artifact action in the workflow yaml
+      - name: Fail workflow if there was a failure earlier
+        run: |
+          chmod +x .github/scripts/common.py
+          echo "$PYTHONPATH"
+          export PYTHONPATH=${PYTHONPATH}:${PWD}/.github/scripts
+          echo "$PYTHONPATH"
+          echo "${GITHUB_CONTEXT}"
+          python -c 'import sys; from gccWorkflow import *; Config.Reload(sys.argv[1], sys.argv[2]); Config.RaiseErrorIfWorkflowFailed()' "${GITHUB_CONTEXT}" "${GITHUB_TOKEN}"
+        shell: bash
+        env:
+          GITHUB_CONTEXT: ${{steps.test-composite.outputs.configJson}}
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+        
diff --git a/.github/workflows/update-main.yaml b/.github/workflows/update-main.yaml
new file mode 100644
index 00000000000..e420a08f1b6
--- /dev/null
+++ b/.github/workflows/update-main.yaml
@@ -0,0 +1,354 @@
+# This workflow does the following things:
+# 1. Takes the latest SHA from the upstream GCC master branch and starts a build and GCC test run
+# 2. Any new failures encountered during the run that are not already in the .xfail file in vendors/microsoft/main will get added to the .xfail file.
+# 3. A new branch will be created in the GitHub repository with the gcc/master SHA we used earlier merged into vendors/microsoft/main plus the newly added .xfail failures (if any).
+# 4. A build and a test-gcc workflow will then get kicked off against this new branch.
+# 5. This workflow will wait until the build and test-gcc workflows are done.
+# 6. If the build and test-gcc steps are successful, it will print the set of git commands someone can run manually to actually push the changes to our vendor branch.
+name: update-main
+
+# Run this workflow every day at 3 am UTC
+on:
+  schedule:
+    - cron: "0 3 * * *"
+  workflow_dispatch:
+
+jobs:
+  gcc-master-build:
+    name: Build the latest gcc/master toolchain
+    runs-on: ubuntu-18.04
+    outputs:
+      config: ${{ steps.printNoSecretJson.outputs.noSecretConfigJson }}
+      masterSHA: ${{ steps.printCurrentSHA.outputs.currentSHA }}
+      matrixTestSet: ${{ steps.printGCCTestSet.outputs.testSet }}
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          submodules: recursive
+          lfs: true
+    
+      - name: Setup Python 3.7
+        uses: actions/setup-python@v1
+        with:
+          python-version: 3.7
+        
+      # Install requests package which is used in downloadBuildArtifact.py 
+      - name: Pip Install Requests
+        run: python -m pip install requests
+        shell: bash
+
+      # Add upstream gcc as remote and checkout from their master branch
+      - name: Add upstream remote and checkout upstream gcc master branch
+        uses: ./.github/actions/pull-master-upstream
+        with:
+          scriptsRef: ${{ github.sha }}
+
+      # Print the current SHA so we know what master SHA we synced to and can sync to it later
+      - name: Print current SHA
+        id: printCurrentSHA
+        run: |
+          chmod +x .github/scripts/common.py
+          export PYTHONPATH=${PYTHONPATH}:${PWD}/.github/scripts
+          python -c 'from common import *; GetandPrintCurrentSHA()'
+        shell: bash
+      
+      # Print the chunked set of GCC tests we'd like to run in a matrix later
+      - name: Print GCC test set
+        id: printGCCTestSet
+        uses: ./.github/actions/print-gcc-testset
+
+        # Setup config
+      - name: Setup config
+        id: setupconfig
+        run: |
+          chmod +x .github/scripts/gccWorkflow.py
+          echo "$PYTHONPATH"
+          export PYTHONPATH=${PYTHONPATH}:${PWD}/.github/scripts
+          echo "$PYTHONPATH"
+          echo "${GITHUB_CONTEXT}"
+          python -c 'import sys; from gccWorkflow import *; GccWorkflow.Init(sys.argv[1], sys.argv[2], True)' "${GITHUB_CONTEXT}" "${GITHUB_TOKEN}"
+        shell: bash
+        env:
+          GITHUB_CONTEXT: ${{ toJson(github) }}
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+          
+      - name: Build GCC
+        uses: ./.github/actions/build-composite
+        with: 
+          configjson: ${{ steps.setupconfig.outputs.configJson }}
+
+      - name: Move objdir to be in repo so it can be uploaded
+        run: mv ../objdir objdir
+
+      - name: Upload build output
+        uses: actions/upload-artifact@v2
+        with:
+          name: gccBuild
+          path: objdir
+
+      # This should be the last step on this machine since it will clear out fields in the config json
+      - name: Print No Secret Json
+        id: printNoSecretJson
+        run: |
+          chmod +x .github/scripts/config.py
+          export PYTHONPATH=${PYTHONPATH}:${PWD}/.github/scripts
+          python -c 'import sys; from config import *; Config.PrintNoSecretConfigJsonFromJson(sys.argv[1])' "${CONFIG_JSON}"
+        shell: bash
+        env:
+          CONFIG_JSON: ${{ steps.setupconfig.outputs.configJson }}
+          
+  gcc-master-test: 
+    name: Run tests against gcc/master bits built in the previous step
+    needs: gcc-master-build
+    strategy:
+      matrix: ${{fromJSON(needs.gcc-master-build.outputs.matrixTestSet)}}
+      # Avoid cancelling other matrix chunks even if one fails
+      fail-fast: false
+
+    runs-on: ubuntu-18.04
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          submodules: recursive
+          lfs: true
+
+      - name: Setup Python 3.7
+        uses: actions/setup-python@v1
+        with:
+          python-version: 3.7
+        
+      # Install requests package which is used in downloadBuildArtifact.py 
+      - name: Pip Install Requests
+        run: python -m pip install requests
+        shell: bash
+        
+      # Add upstream gcc as remote and checkout the SHA we used in the master build
+      - name: Add upstream remote and checkout SHA used in build step
+        uses: ./.github/actions/pull-master-upstream
+        with:
+          scriptsRef: ${{ github.sha }}
+          masterRef: ${{ needs.gcc-master-build.outputs.masterSHA }}
+        
+      # We need to download the build artifact here instead of in DownloadBuildArtifact() because artifacts are not available
+      # for download through the REST API until after the workflow has completed. We can only get the artifacts from the
+      # previous jobs through the download-artifact action.
+      - name: Download gccBuild artifact
+        uses: actions/download-artifact@v2
+        with:
+          name: gccBuild
+          path: objdir
+
+      - name: Move objdir outside repo
+        run: mv objdir ../
+
+      # build.yaml creates the gccBuild artifact
+      - name: Run GCC tests
+        uses: ./.github/actions/test-composite
+        with: 
+          configjson: ${{needs.gcc-master-build.outputs.config}}
+          testSet: ${{ matrix.testSet }}
+          githubtoken: ${{ secrets.GITHUB_TOKEN }}
+          buildDownloaded: True 
+      
+      - name: Move objdir to be in repo so logs inside can be uploaded
+        run: mv ../objdir objdir
+
+      - name: Upload test logs
+        uses: actions/upload-artifact@v2
+        with:
+          name: master_${{ matrix.testSet }}_logs
+          path: objdir/logs
+
+  parse-failures:
+    name: Parse failures and start build and test runs with gcc/master merged into vendors/microsoft/main
+    needs: [gcc-master-test, gcc-master-build]
+    runs-on: ubuntu-18.04
+    outputs:
+      newBranchName: ${{ steps.setBranchName.outputs.newBranchName }}
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          submodules: recursive
+          lfs: true
+          token: ${{ secrets.VICTORPAT }} # The basic ${{ github.token }} doesn't include "workflows" write permission access to modify workflows in the .github directory
+
+      - name: Setup Python 3.7
+        uses: actions/setup-python@v1
+        with:
+          python-version: 3.7
+          
+      # Install requests package which is used in downloadBuildArtifact.py 
+      - name: Pip Install Requests
+        run: |
+          python -m pip install requests
+        shell: bash
+
+      # Add upstream gcc as remote, setup vendor branches and checkout vendors/microsoft/main
+      - name: Setup and checkout vendors/microsoft/main
+        uses: ./.github/actions/setup-vendor-branches
+        with:
+          scriptsRef: ${{ github.sha }}
+          vendorRef: vendors/microsoft/main
+
+      - name: Parse failures
+        run: | 
+          chmod +x .github/scripts/gccWorkflow.py
+          echo "$PYTHONPATH"
+          echo "$CONFIG_JSON"
+          export PYTHONPATH=${PYTHONPATH}:${PWD}/.github/scripts
+          echo "$PYTHONPATH"
+          echo "$BUILD_DOWNLOADED"
+          python -c 'import sys; from gccWorkflow import *; GccWorkflow.ParseTestLogs(sys.argv[1], sys.argv[2])' "${CONFIG_JSON}" "${GITHUB_TOKEN}"
+        shell: bash
+        env:
+          CONFIG_JSON: ${{needs.gcc-master-build.outputs.config}}
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+
+      # newFailures.patch is created in ParseTestLogs IFF there were failures in the master run that aren't in the .xfail file
+      - name: Check if failures patch exists
+        id: check_failures_patch
+        uses: andstor/file-existence-action@v1
+        with:
+          files: "newFailures.patch"
+
+      - name: Upload failures patch
+        if: ${{ steps.check_failures_patch.outputs.files_exists == 'true' }}
+        uses: actions/upload-artifact@v2
+        with:
+          name: newFailures.patch
+          path: newFailures.patch
+
+      - name: Get current date
+        id: date
+        run: echo "::set-output name=date::$(date +'%Y-%m-%dT%H-%M-%S')"
+
+      - name: Set new branch name
+        id: setBranchName
+        run: echo "::set-output name=newBranchName::${name}"
+        env:
+          name: merge-master-${{ needs.gcc-master-build.outputs.masterSHA }}-${{ steps.date.outputs.date }}
+
+      # Use .github scripts in this branch in case the workflow is testing infrastructure changes
+      - name: Merge master SHA and push to our GitHub repo to start run
+        run: |
+          git checkout -b ${newBranchName}
+          git add .github/*
+          git commit -m "Bring latest scripts from ${BRANCH_REF}"
+          git fetch gcc master
+          git branch -a
+          git merge ${masterSHA}
+          git push origin HEAD -f
+        shell: bash
+        env:
+          newBranchName: ${{ steps.setBranchName.outputs.newBranchName }}
+          masterSHA: ${{ needs.gcc-master-build.outputs.masterSHA }}
+          BRANCH_REF: ${{ github.sha }}
+
+      - name: Invoke build workflow
+        uses: benc-uk/workflow-dispatch@v1
+        with:
+          workflow: build
+          token: ${{ secrets.VICTORPAT }}
+          ref: ${{ steps.setBranchName.outputs.newBranchName }}
+
+      - name: Invoke test-gcc workflow
+        uses: benc-uk/workflow-dispatch@v1
+        with:
+          workflow: test-gcc
+          token: ${{ secrets.VICTORPAT }}
+          ref: ${{ steps.setBranchName.outputs.newBranchName }}
+
+      - name: Sleep before waiting on build and test runs
+        run: |
+          sleep 3m
+        shell: bash
+
+  wait-build:
+    name: Waiting on build with gcc/master merged into vendors/microsoft/main
+    needs: [parse-failures]
+    runs-on: ubuntu-18.04
+    outputs: 
+      newFailuresPatchExists: ${{ steps.check_failures_patch.outputs.files_exists }}
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          submodules: recursive
+          lfs: true
+
+      - name: Setup Python 3.7
+        uses: actions/setup-python@v1
+        with:
+          python-version: 3.7
+      
+      # Install requests package which is used in downloadBuildArtifact.py 
+      - name: Pip Install Requests
+        run: |
+          python -m pip install requests
+        shell: bash
+        
+      - name: Wait for build
+        run: | 
+          chmod +x .github/scripts/downloadBuildArtifact.py
+          echo "$PYTHONPATH"
+          export PYTHONPATH=${PYTHONPATH}:${PWD}/.github/scripts
+          echo "$PYTHONPATH"
+          echo "${GITHUB_CONTEXT}"
+          python -c 'import sys; from downloadBuildArtifact import *; WaitOnLatestWorkflow(sys.argv[1], sys.argv[2], sys.argv[3], True)' "${mergeBranch}" "build" "${GITHUB_TOKEN}"
+        shell: bash
+        env:
+          mergeBranch: ${{ needs.parse-failures.outputs.newBranchName }}
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+
+  wait-test:
+    name: Waiting on tests with gcc/master merged into vendors/microsoft/main
+    needs: [parse-failures, wait-build, gcc-master-build]
+    runs-on: ubuntu-18.04
+    outputs: 
+      newFailuresPatchExists: ${{ steps.check_failures_patch.outputs.files_exists }}
+    steps:
+      - name: Checkout
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          submodules: recursive
+          lfs: true
+
+      - name: Setup Python 3.7
+        uses: actions/setup-python@v1
+        with:
+          python-version: 3.7
+          
+      # Install requests package which is used in downloadBuildArtifact.py 
+      - name: Pip Install Requests
+        run: |
+          python -m pip install requests
+        shell: bash
+
+      - name: Wait for test-gcc
+        run: |
+          chmod +x .github/scripts/downloadBuildArtifact.py
+          echo "$PYTHONPATH"
+          export PYTHONPATH=${PYTHONPATH}:${PWD}/.github/scripts
+          echo "$PYTHONPATH"
+          echo "${GITHUB_CONTEXT}"
+          python -c 'import sys; from downloadBuildArtifact import *; WaitOnLatestWorkflow(sys.argv[1], sys.argv[2], sys.argv[3], True)' "${mergeBranch}" "test-gcc" "${GITHUB_TOKEN}"
+        shell: bash
+        env:
+          mergeBranch: ${{ needs.parse-failures.outputs.newBranchName }}
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+      
+      - name: Print git commands to use to merge and push manually
+        run: |
+          printf "Run the following commands. They assume that you have the microsoft/gcc repository at origin and the upstream GCC Microsoft vendor branches set up at the gcc remote.\n"
+          printf "git fetch gcc\ngit checkout vendors/microsoft/main\ngit merge origin/${mergeBranch}\ngit push gcc vendors/microsoft/main\n"
+        env:
+          masterSHA: ${{ needs.gcc-master-build.outputs.masterSHA }}
+          mergeBranch: ${{ needs.parse-failures.outputs.newBranchName }}
diff --git a/.github/workflows/update-mirror-branches.yaml b/.github/workflows/update-mirror-branches.yaml
new file mode 100644
index 00000000000..c95fe00db98
--- /dev/null
+++ b/.github/workflows/update-mirror-branches.yaml
@@ -0,0 +1,61 @@
+# This workflow mirrors our vendor branches into our GitHub repository to trigger the CI workflow if any new changes are present.
+# The workflow will automatically push these changes and will not require manual intervention.
+name: update-mirror-branches
+
+# Run this workflow every 10 minutes
+on:
+  schedule:
+    - cron: "*/10 * * * *"
+  workflow_dispatch:
+
+jobs:
+  sync:
+    runs-on: ubuntu-18.04
+    strategy:
+      matrix:
+        upstreamBranch: [""]
+        githubBranch: [""]
+
+        # upstream vendors/microsoft/main maps to current
+        # upstream vendors/microsoft/9.1.0 maps to gcc-9.1.0
+        include:
+          - upstreamBranch: main
+            githubBranch: current-temp #TODO: Change this when we have everything in current in the upstream vendor branch
+          - upstreamBranch: '9.1.0'
+            githubBranch: gcc-9.1.0
+
+        # Exclude the default case because we can't have empty matrices
+        exclude:
+          - upstreamBranch: ""
+            githubBranch: ""
+
+      # Avoid cancelling other matrix chunks even if one fails
+      fail-fast: false    
+    steps:
+      - name: Checkout  
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          submodules: recursive
+          lfs: true
+          token: ${{ secrets.VICTORPAT }} # The basic ${{ github.token }} doesn't include "workflows" write permission access to modify workflows in the .github directory
+
+      # Add upstream gcc as remote, setup vendor branches and checkout matrix vendor branch
+      - name: Setup and checkout vendors/microsoft/${{ matrix.upstreamBranch }}
+        uses: ./.github/actions/setup-vendor-branches
+        with:
+          scriptsRef: ${{ github.sha }}
+          vendorRef: vendors/microsoft/${{ matrix.upstreamBranch }}
+
+      - name: Get current date
+        id: date
+        run: echo "::set-output name=date::$(date +'%Y-%m-%dT%H-%M-%S')"
+
+      - name: Push to branch in GitHub repo
+        run: |
+          git checkout -b ${newBranchName}
+          git push origin -f ${newBranchName}:${githubBranch}
+        shell: bash
+        env:
+          newBranchName: update-mirror-${{ matrix.githubBranch }}-${{ steps.date.outputs.date }}
+          githubBranch: ${{ matrix.githubBranch }}


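The wait-build and wait-test jobs above block on the spawned runs via downloadBuildArtifact.py's WaitOnLatestWorkflow, which is not included in this commit. As a rough, hypothetical sketch of the polling pattern such a helper likely implements (the function names, parameters, and the fetch callback are assumptions, not the actual script):

```python
# Hedged sketch (not the real downloadBuildArtifact.py): poll the newest
# run of a workflow on a branch until it completes, as a
# WaitOnLatestWorkflow-style helper might do against the GitHub Actions
# REST API (GET /repos/{owner}/{repo}/actions/runs).
import time


def run_finished(run):
    # A run is done once the API reports status "completed".
    return run.get("status") == "completed"


def run_succeeded(run):
    # A finished run passed only if its conclusion is "success".
    return run_finished(run) and run.get("conclusion") == "success"


def wait_on_latest_workflow(fetch_latest_run, poll_seconds=60,
                            max_polls=120, sleep=time.sleep):
    """fetch_latest_run() returns the newest run dict for the target
    workflow/branch. Raises RuntimeError if the run fails or we time out."""
    for _ in range(max_polls):
        run = fetch_latest_run()
        if run_finished(run):
            if run_succeeded(run):
                return run
            raise RuntimeError("workflow run %s concluded %r"
                               % (run.get("id"), run.get("conclusion")))
        sleep(poll_seconds)
    raise RuntimeError("timed out waiting for workflow run")
```

In the real workflow, `fetch_latest_run` would wrap a `requests.get` call with the GITHUB_TOKEN, which is why the jobs pip-install requests first.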