Increase retries for downloading from GS

We're seeing semi-regular failures downloading files from GS via Bazel.
While we wait for an investigation from the GS team into the root cause,
increase the retry count in the hopes that this is just a flake that will
be solved by more retries.

BUG=b:338670280
TEST=CQ passes

Change-Id: Ief7a125ac6e2d833df47903af963f3e70533dbba
Reviewed-on: https://chromium-review.googlesource.com/c/chromiumos/bazel/+/5531437
Tested-by: Tim Bain <tbain@google.com>
Commit-Queue: Raul Rangel <rrangel@chromium.org>
Reviewed-by: Raul Rangel <rrangel@chromium.org>
Auto-Submit: Tim Bain <tbain@google.com>
Commit-Queue: Tim Bain <tbain@google.com>
diff --git a/repo_defs/gs.bzl b/repo_defs/gs.bzl
index ac615d0..b80286a 100644
--- a/repo_defs/gs.bzl
+++ b/repo_defs/gs.bzl
@@ -53,7 +53,10 @@
     ),
 }
 
-_MAX_RETRIES = 2
+# We'd like to use a value of 2 here, but due to b/338670280 we're hitting
+# semi-frequent failures with that value. Let's bump to 4 retries (5 total
+# tries) to try to work around the problem.
+_MAX_RETRIES = 4
 
 def download_gs_file(repository_ctx):
     repository_ctx.report_progress("Downloading from GS.")
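
For context, here is a minimal Starlark sketch of how a repository rule
like download_gs_file might consume _MAX_RETRIES. The gsutil invocation,
helper name, and error handling below are illustrative assumptions, not
the actual body of repo_defs/gs.bzl:

    _MAX_RETRIES = 4

    def _download_with_retries(repository_ctx, gs_url, output):
        """Runs `gsutil cp`, retrying up to _MAX_RETRIES times on failure."""
        last_error = None
        for attempt in range(_MAX_RETRIES + 1):  # 4 retries = 5 total tries.
            result = repository_ctx.execute(["gsutil", "cp", gs_url, output])
            if result.return_code == 0:
                return
            last_error = result.stderr
            if attempt < _MAX_RETRIES:
                repository_ctx.report_progress(
                    "Retrying GS download (retry {} of {}).".format(
                        attempt + 1,
                        _MAX_RETRIES,
                    ),
                )
        fail("Failed to download {} after {} tries: {}".format(
            gs_url,
            _MAX_RETRIES + 1,
            last_error,
        ))

With this shape, bumping _MAX_RETRIES from 2 to 4 changes the total number
of attempts from 3 to 5 without touching any per-attempt timeout.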