devserver: check presence of actual files when asserting staging
We have been seeing failures of lab tests due to staged files that are
missing despite the devserver reporting their artifact as staged. Since
the devserver uses an (empty) marker file to denote that an artifact has
been staged, this probably means that we suffer from either (i) a rare
bug that causes us to place a marker file for an artifact that was not
fully staged; or (ii) a rare race condition where files of a previously
staged artifact are being deleted, but not the marker file.
This CL works around both problems using brute force: it stores the list
of files that were actually "installed" (i.e. found their way to the
cache directory) explicitly in the artifact's marker file. When checking
whether an artifact has been staged, it reads the list of files and
ensures that each one of them is present. Some technical notes:
* When we fail to find one or more of the listed files, a loud error
message is logged (which includes the list of missing files) and the
marker file is removed entirely. This shortens the time needed for
subsequent queries.
* To keep the logic simple, we store absolute file paths, which are easy
to check, albeit not portable; this shouldn't be a problem assuming we
do not expect a devserver's cache to relocate.
* We list only files (including symlinks); hence, the presence of
directories is required only as implied by file paths. We further list
all files that were copied to the cache directory, which may include
(for example) the archive that was downloaded from GS and used for
extracting one or more files. This may be relaxed in the future if we
agree that such temporary files are not being used further; it has to
be relaxed if we decide to remove these archives after extraction
(e.g. to preserve cache space).
* The staging verification logic is written such that the files listed
in the marker file constitute the smallest set of files whose presence
entails the artifact being staged. This means that existing devserver
caches (e.g. in the lab) will remain consistent wrt staging queries.
* Some artifacts (currently, the Autotest tarball artifact) opt not to
store the full list of files. As noted above, this is a valid choice
and simply means that, for that artifact, staging verification will
have the same semantics it had before this change. Storing explicit
file lists is harder to justify for artifacts that contain a large
number of files, such as the Autotest tarball. We may decide to undo
this exception in the future if we keep getting inconsistent staging
verification results for these artifacts.
* Unit tests were extended to validate the content of marker files for
each tested artifact. A new test was added for the staging
verification logic.
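The verification described above amounts to a short presence check. A minimal
sketch (hypothetical names; the actual logic lives in ArtifactStaged in
build_artifact.py):

```python
import os

def artifact_staged(install_dir, marker_name):
    """Sketch: an artifact is staged iff its marker exists and every
    file listed in the marker is still present on disk."""
    marker_file = os.path.join(install_dir, marker_name)
    if not os.path.exists(marker_file):
        return False
    with open(marker_file) as f:
        listed = [line.strip() for line in f if line.strip()]
    missing = [fname for fname in listed if not os.path.exists(fname)]
    if missing:
        # Marker is stale; remove it to shorten subsequent queries.
        os.remove(marker_file)
        return False
    return True
```

Note that an empty (legacy) marker lists no files and therefore still
passes, which is what keeps existing devserver caches consistent.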
BUG=chromium:277839
TEST=Unit testing of new staging logic.
TEST=All existing unit tests.
Change-Id: Ib2ee1d56dbe31da095c3afd23f05d5fdf1200b0f
Reviewed-on: https://chromium-review.googlesource.com/176557
Tested-by: Gilad Arnold <garnold@chromium.org>
Reviewed-by: Don Garrett <dgarrett@chromium.org>
Reviewed-by: Chris Sosa <sosa@chromium.org>
Commit-Queue: Gilad Arnold <garnold@chromium.org>
diff --git a/build_artifact.py b/build_artifact.py
index 9c8e5d4..d1623b6 100755
--- a/build_artifact.py
+++ b/build_artifact.py
@@ -75,6 +75,10 @@
install_dir: The final location where the artifact should be staged to.
single_name: If True the name given should only match one item. Note, if not
True, self.name will become a list of items returned.
+ installed_files: A list of files that were the final result of downloading
+ and setting up the artifact.
+ store_installed_files: Whether the list of installed files is stored in the
+ marker file.
"""
def __init__(self, install_dir, archive_url, name, build,
@@ -114,6 +118,9 @@
self.single_name = True
+ self.installed_files = []
+ self.store_installed_files = True
+
@staticmethod
def _SanitizeName(name):
"""Sanitizes name to be used for creating a file on the filesystem.
@@ -128,13 +135,42 @@
return name.replace('*', 'STAR').replace('.', 'DOT').replace('/', 'SLASH')
def ArtifactStaged(self):
- """Returns True if artifact is already staged."""
- return os.path.exists(os.path.join(self.install_dir, self.marker_name))
+ """Returns True if artifact is already staged.
+
+ This checks for (1) presence of the artifact marker file, and (2) the
+ presence of each installed file listed in this marker. Both must hold for
+ the artifact to be considered staged. Note that this method is safe for use
+ even if the artifacts were not staged by this instance, as it is assumed
+ that any BuildArtifact instance that did the staging wrote the list of
+ files actually installed into the marker.
+ """
+ marker_file = os.path.join(self.install_dir, self.marker_name)
+
+ # If the marker is missing, it's definitely not staged.
+ if not os.path.exists(marker_file):
+ return False
+
+ # We want to ensure that every file listed in the marker is actually there.
+ if self.store_installed_files:
+ with open(marker_file) as f:
+ files = [line.strip() for line in f]
+
+ # Check to see if any of the purportedly installed files are missing, in
+ # which case the marker is outdated and should be removed.
+ missing_files = [fname for fname in files if not os.path.exists(fname)]
+ if missing_files:
+ self._Log('***ATTENTION*** %s files listed in %s are missing:\n%s',
+ 'All' if len(files) == len(missing_files) else 'Some',
+ marker_file, '\n'.join(missing_files))
+ os.remove(marker_file)
+ return False
+
+ return True
def _MarkArtifactStaged(self):
"""Marks the artifact as staged."""
with open(os.path.join(self.install_dir, self.marker_name), 'w') as f:
- f.write('')
+ f.write('\n'.join(self.installed_files))
def WaitForArtifactToExist(self, timeout, update_name=True):
"""Waits for artifact to exist and sets self.name to appropriate name.
@@ -170,8 +206,10 @@
gsutil_util.DownloadFromGS(gs_path, self.install_path)
def _Setup(self):
- """For tarball like artifacts, extracts and prepares contents."""
- pass
+ """Processes the downloaded content and updates the list of installed files."""
+ # In this primitive case, what was downloaded (has to be a single file) is
+ # what's installed.
+ self.installed_files = [self.install_path]
def _ClearException(self):
"""Delete any existing exception saved for this artifact."""
@@ -268,6 +306,10 @@
devserver_constants.UPDATE_FILE)
shutil.move(install_path, new_install_path)
+ # Reflect the rename in the list of installed files.
+ self.installed_files.remove(install_path)
+ self.installed_files.append(new_install_path)
+
# TODO(sosa): Change callers to make this artifact more sane.
class DeltaPayloadsArtifact(BuildArtifact):
@@ -317,11 +359,16 @@
try:
artifact.Process(no_wait=True)
# Setup symlink so that AU will work for this payload.
+ stateful_update_symlink = os.path.join(
+ artifact.install_dir, devserver_constants.STATEFUL_FILE)
os.symlink(
os.path.join(os.pardir, os.pardir,
devserver_constants.STATEFUL_FILE),
- os.path.join(artifact.install_dir,
- devserver_constants.STATEFUL_FILE))
+ stateful_update_symlink)
+
+ # Aggregate sub-artifact file lists, including stateful symlink.
+ self.installed_files += artifact.installed_files
+ self.installed_files.append(stateful_update_symlink)
except ArtifactDownloadError as e:
self._Log('Could not process %s: %s', artifact, e)
@@ -357,12 +404,17 @@
"""Extracts the bundle into install_dir. Must be overridden.
If set, uses files_to_extract to only extract those items. If set, use
- exclude to exclude specific files.
+ exclude to exclude specific files. In any case, this must return the list
+ of files extracted (absolute paths).
"""
raise NotImplementedError()
def _Setup(self):
- self._Extract()
+ extract_result = self._Extract()
+ if self.store_installed_files:
+ # List both the archive and the extracted files.
+ self.installed_files.append(self.install_path)
+ self.installed_files.extend(extract_result)
class TarballBuildArtifact(BundledBuildArtifact):
@@ -375,9 +427,10 @@
extension and extracts the tarball into the install_path.
"""
try:
- common_util.ExtractTarball(self.install_path, self.install_dir,
- files_to_extract=self._files_to_extract,
- excluded_files=self._exclude)
+ return common_util.ExtractTarball(self.install_path, self.install_dir,
+ files_to_extract=self._files_to_extract,
+ excluded_files=self._exclude,
+ return_extracted_files=True)
except common_util.CommonUtilError as e:
raise ArtifactDownloadError(str(e))
@@ -385,6 +438,12 @@
class AutotestTarballBuildArtifact(TarballBuildArtifact):
"""Wrapper around the autotest tarball to download from gsutil."""
+ def __init__(self, *args):
+ super(AutotestTarballBuildArtifact, self).__init__(*args)
+ # We don't store/check explicit file lists in Autotest tarball markers;
+ # this can get huge and unwieldy, and generally make little sense.
+ self.store_installed_files = False
+
def _Setup(self):
"""Extracts the tarball into the install path excluding test suites."""
super(AutotestTarballBuildArtifact, self)._Setup()
@@ -411,11 +470,13 @@
class ZipfileBuildArtifact(BundledBuildArtifact):
"""A downloadable artifact that is a zipfile."""
- def _Extract(self):
- """Extracts files into the install path."""
- # Unzip is weird. It expects its args before any excepts and expects its
- # excepts in a list following the -x.
- cmd = ['unzip', '-o', self.install_path, '-d', self.install_dir]
+ def _RunUnzip(self, list_only):
+ # Unzip is weird. It expects its args before any excludes and expects its
+ # excludes in a list following the -x.
+ cmd = ['unzip', '-qql' if list_only else '-o', self.install_path]
+ if not list_only:
+ cmd += ['-d', self.install_dir]
+
if self._files_to_extract:
cmd.extend(self._files_to_extract)
@@ -424,12 +485,22 @@
cmd.extend(self._exclude)
try:
- subprocess.check_call(cmd)
+ return subprocess.check_output(cmd).strip('\n').splitlines()
except subprocess.CalledProcessError, e:
raise ArtifactDownloadError(
'An error occurred when attempting to unzip %s:\n%s' %
(self.install_path, e))
+ def _Extract(self):
+ """Extracts files into the install path."""
+ # 'unzip -qql' prints a fixed-width size/date/time prefix (30 characters)
+ # before each entry name; strip it and skip directory entries.
+ file_list = [os.path.join(self.install_dir, line[30:].strip())
+ for line in self._RunUnzip(True)
+ if not line.endswith('/')]
+ if file_list:
+ self._RunUnzip(False)
+
+ return file_list
+
class ImplDescription(object):
"""Data wrapper that describes an artifact's implementation."""
diff --git a/build_artifact_unittest.py b/build_artifact_unittest.py
index 82011d0..46e9249 100755
--- a/build_artifact_unittest.py
+++ b/build_artifact_unittest.py
@@ -11,6 +11,7 @@
"""
import os
+import random
import shutil
import subprocess
import tempfile
@@ -19,6 +20,7 @@
import mox
import build_artifact
+import devserver_constants
_VERSION = 'R26-3646.0.0-rc1'
@@ -27,6 +29,69 @@
_TEST_NON_EXISTING_GOLO_ARCHIVE = (
'gs://chromeos-image-archive/x86-generic-chromium-pfq/R26-no_such_build')
+_TEST_GOLO_ARCHIVE_TEST_TARBALL_CONTENT = [
+ 'autotest/test_suites/control.PGO_record',
+ 'autotest/test_suites/control.au',
+ 'autotest/test_suites/control.audio',
+ 'autotest/test_suites/control.browsertests',
+ 'autotest/test_suites/control.bvt',
+ 'autotest/test_suites/control.dummy',
+ 'autotest/test_suites/control.enterprise',
+ 'autotest/test_suites/control.enterprise_enroll',
+ 'autotest/test_suites/control.faft_dev',
+ 'autotest/test_suites/control.faft_ec',
+ 'autotest/test_suites/control.faft_normal',
+ 'autotest/test_suites/control.graphics',
+ 'autotest/test_suites/control.graphicsGLES',
+ 'autotest/test_suites/control.hwqual',
+ 'autotest/test_suites/control.kernel_daily_benchmarks',
+ 'autotest/test_suites/control.kernel_daily_regression',
+ 'autotest/test_suites/control.kernel_per-build_benchmarks',
+ 'autotest/test_suites/control.kernel_per-build_regression',
+ 'autotest/test_suites/control.kernel_weekly_regression',
+ 'autotest/test_suites/control.link_perf',
+ 'autotest/test_suites/control.network3g',
+ 'autotest/test_suites/control.network3g_gobi',
+ 'autotest/test_suites/control.network_wifi',
+ 'autotest/test_suites/control.onccell',
+ 'autotest/test_suites/control.pagecycler',
+ 'autotest/test_suites/control.perfalerts',
+ 'autotest/test_suites/control.power_build',
+ 'autotest/test_suites/control.power_daily',
+ 'autotest/test_suites/control.power_requirements',
+ 'autotest/test_suites/control.pyauto',
+ 'autotest/test_suites/control.pyauto_basic',
+ 'autotest/test_suites/control.pyauto_endurance',
+ 'autotest/test_suites/control.pyauto_perf',
+ 'autotest/test_suites/control.regression',
+ 'autotest/test_suites/control.security',
+ 'autotest/test_suites/control.servo',
+ 'autotest/test_suites/control.smoke',
+ 'autotest/test_suites/control.sync',
+ 'autotest/test_suites/control.vda',
+ 'autotest/test_suites/control.video',
+ 'autotest/test_suites/control.webrtc',
+ 'autotest/test_suites/control.wificell',
+ 'autotest/test_suites/control.wifichaos',
+ 'autotest/test_suites/dependency_info',
+ 'autotest/test_suites/dev_harness.py',
+]
+
+_TEST_GOLO_ARCHIVE_IMAGE_ZIPFILE_CONTENT = [
+ 'au-generator.zip',
+ 'boot.config',
+ 'boot.desc',
+ 'chromiumos_qemu_image.bin',
+ 'chromiumos_test_image.bin',
+ 'config.txt',
+ 'mount_image.sh',
+ 'oem.image',
+ 'pack_partitions.sh',
+ 'umount_image.sh',
+ 'unpack_partitions.sh',
+]
+
+
# Different as the above does not have deltas (for smaller artifacts).
_DELTA_VERSION = 'R26-3645.0.0'
_TEST_GOLO_FOR_DELTAS = (
@@ -43,33 +108,52 @@
def tearDown(self):
shutil.rmtree(self.work_dir)
+ def _CheckMarker(self, marker_file, installed_files):
+ with open(os.path.join(self.work_dir, marker_file)) as f:
+ self.assertItemsEqual(installed_files, [line.strip() for line in f])
+
def testProcessBuildArtifact(self):
"""Processes a real tarball from GSUtil and stages it."""
artifact = build_artifact.BuildArtifact(
self.work_dir,
_TEST_GOLO_ARCHIVE, build_artifact.TEST_SUITES_FILE, _VERSION)
artifact.Process(False)
+ self.assertItemsEqual(
+ artifact.installed_files,
+ [os.path.join(self.work_dir, build_artifact.TEST_SUITES_FILE)])
self.assertTrue(os.path.exists(os.path.join(
self.work_dir, build_artifact.TEST_SUITES_FILE)))
+ self._CheckMarker(artifact.marker_name, artifact.installed_files)
def testProcessTarball(self):
"""Downloads a real tarball and untars it."""
artifact = build_artifact.TarballBuildArtifact(
self.work_dir, _TEST_GOLO_ARCHIVE, build_artifact.TEST_SUITES_FILE,
_VERSION)
+ expected_installed_files = [
+ os.path.join(self.work_dir, filename)
+ for filename in ([build_artifact.TEST_SUITES_FILE] +
+ _TEST_GOLO_ARCHIVE_TEST_TARBALL_CONTENT)]
artifact.Process(False)
+ self.assertItemsEqual(artifact.installed_files, expected_installed_files)
self.assertTrue(os.path.isdir(os.path.join(
self.work_dir, 'autotest', 'test_suites')))
+ self._CheckMarker(artifact.marker_name, artifact.installed_files)
def testProcessTarballWithFile(self):
"""Downloads a real tarball and only untars one file from it."""
file_to_download = 'autotest/test_suites/control.au'
artifact = build_artifact.TarballBuildArtifact(
self.work_dir, _TEST_GOLO_ARCHIVE, build_artifact.TEST_SUITES_FILE,
- _VERSION, [file_to_download])
+ _VERSION, files_to_extract=[file_to_download])
+ expected_installed_files = [
+ os.path.join(self.work_dir, filename)
+ for filename in [build_artifact.TEST_SUITES_FILE] + [file_to_download]]
artifact.Process(False)
+ self.assertItemsEqual(artifact.installed_files, expected_installed_files)
self.assertTrue(os.path.exists(os.path.join(
self.work_dir, file_to_download)))
+ self._CheckMarker(artifact.marker_name, artifact.installed_files)
def testDownloadAutotest(self):
"""Downloads a real autotest tarball for test."""
@@ -81,52 +165,81 @@
install_dir = self.work_dir
artifact.staging_dir = install_dir
- artifact._Download()
self.mox.StubOutWithMock(subprocess, 'check_call')
- artifact._Extract()
subprocess.check_call(mox.In('autotest/utils/packager.py'), cwd=install_dir)
+ self.mox.StubOutWithMock(artifact, 'WaitForArtifactToExist')
+ artifact.WaitForArtifactToExist(1)
+ artifact._Download()
+ artifact._Extract()
self.mox.ReplayAll()
- artifact._Setup()
+ artifact.Process(True)
self.mox.VerifyAll()
+ self.assertItemsEqual(artifact.installed_files, [])
self.assertTrue(os.path.isdir(
os.path.join(self.work_dir, 'autotest', 'packages')))
+ self._CheckMarker(artifact.marker_name, [])
def testAUTestPayloadBuildArtifact(self):
"""Downloads a real tarball and treats it like an AU payload."""
artifact = build_artifact.AUTestPayloadBuildArtifact(
self.work_dir, _TEST_GOLO_ARCHIVE, build_artifact.TEST_SUITES_FILE,
_VERSION)
+ expected_installed_files = [
+ os.path.join(self.work_dir, devserver_constants.UPDATE_FILE)]
artifact.Process(False)
+ self.assertItemsEqual(artifact.installed_files, expected_installed_files)
self.assertTrue(os.path.exists(os.path.join(
- self.work_dir, 'update.gz')))
+ self.work_dir, devserver_constants.UPDATE_FILE)))
+ self._CheckMarker(artifact.marker_name, artifact.installed_files)
def testDeltaPayloadsArtifact(self):
"""Downloads delta payloads from test bucket."""
artifact = build_artifact.DeltaPayloadsArtifact(
self.work_dir, _TEST_GOLO_FOR_DELTAS, 'DONTCARE', _DELTA_VERSION)
- artifact.Process(False)
+ delta_installed_files = ('update.gz', 'stateful.tgz')
nton_dir = os.path.join(self.work_dir, 'au', '%s_nton' % _DELTA_VERSION)
mton_dir = os.path.join(self.work_dir, 'au', '%s_mton' % _DELTA_VERSION)
+ expected_installed_files = ([os.path.join(nton_dir, filename)
+ for filename in delta_installed_files] +
+ [os.path.join(mton_dir, filename)
+ for filename in delta_installed_files])
+ artifact.Process(False)
+ self.assertItemsEqual(artifact.installed_files, expected_installed_files)
self.assertTrue(os.path.exists(os.path.join(nton_dir, 'update.gz')))
self.assertTrue(os.path.exists(os.path.join(mton_dir, 'update.gz')))
+ self._CheckMarker(artifact.marker_name, artifact.installed_files)
def testImageUnzip(self):
"""Downloads and stages a zip file and extracts a test image."""
+ files_to_extract = ['chromiumos_test_image.bin']
artifact = build_artifact.ZipfileBuildArtifact(
self.work_dir, _TEST_GOLO_ARCHIVE, build_artifact.IMAGE_FILE,
- _VERSION, files_to_extract=['chromiumos_test_image.bin'])
+ _VERSION, files_to_extract=files_to_extract)
+ expected_installed_files = [
+ os.path.join(self.work_dir, filename)
+ for filename in [build_artifact.IMAGE_FILE] + files_to_extract]
artifact.Process(False)
+ self.assertItemsEqual(expected_installed_files, artifact.installed_files)
self.assertTrue(os.path.exists(os.path.join(
self.work_dir, 'chromiumos_test_image.bin')))
+ self._CheckMarker(artifact.marker_name, artifact.installed_files)
def testImageUnzipWithExcludes(self):
"""Downloads and stages a zip file while excluding all large files."""
artifact = build_artifact.ZipfileBuildArtifact(
self.work_dir, _TEST_GOLO_ARCHIVE, build_artifact.IMAGE_FILE,
_VERSION, exclude=['*.bin'])
+ expected_extracted_files = [
+ filename for filename in _TEST_GOLO_ARCHIVE_IMAGE_ZIPFILE_CONTENT
+ if not filename.endswith('.bin')]
+ expected_installed_files = [
+ os.path.join(self.work_dir, filename)
+ for filename in [build_artifact.IMAGE_FILE] + expected_extracted_files]
artifact.Process(False)
+ self.assertItemsEqual(expected_installed_files, artifact.installed_files)
self.assertFalse(os.path.exists(os.path.join(
self.work_dir, 'chromiumos_test_image.bin')))
+ self._CheckMarker(artifact.marker_name, artifact.installed_files)
def testArtifactFactory(self):
"""Tests that BuildArtifact logic works for both named and file artifacts.
@@ -138,14 +251,25 @@
_VERSION)
artifacts = factory.RequiredArtifacts()
self.assertEqual(len(artifacts), 2)
+ expected_installed_files_0 = [
+ os.path.join(self.work_dir, filename) for filename
+ in ([build_artifact.TEST_SUITES_FILE] +
+ _TEST_GOLO_ARCHIVE_TEST_TARBALL_CONTENT)]
+ expected_installed_files_1 = [os.path.join(self.work_dir, file_artifact)]
artifacts[0].Process(False)
artifacts[1].Process(False)
+ self.assertItemsEqual(artifacts[0].installed_files,
+ expected_installed_files_0)
+ self.assertItemsEqual(artifacts[1].installed_files,
+ expected_installed_files_1)
# Test suites directory exists.
self.assertTrue(os.path.exists(os.path.join(
self.work_dir, 'autotest', 'test_suites')))
# File artifact was staged.
self.assertTrue(os.path.exists(os.path.join(self.work_dir,
file_artifact)))
+ self._CheckMarker(artifacts[0].marker_name, artifacts[0].installed_files)
+ self._CheckMarker(artifacts[1].marker_name, artifacts[1].installed_files)
def testProcessBuildArtifactWithException(self):
"""Test processing a non-existing artifact from GSUtil."""
@@ -159,6 +283,29 @@
exception = artifact.GetException()
self.assertEqual(str(exception), str(expected_exception))
+ def testArtifactStaged(self):
+ """Tests the artifact staging verification logic."""
+ artifact = build_artifact.TarballBuildArtifact(
+ self.work_dir, _TEST_GOLO_ARCHIVE, build_artifact.TEST_SUITES_FILE,
+ _VERSION)
+ expected_installed_files = [
+ os.path.join(self.work_dir, filename)
+ for filename in ([build_artifact.TEST_SUITES_FILE] +
+ _TEST_GOLO_ARCHIVE_TEST_TARBALL_CONTENT)]
+ artifact.Process(False)
+
+ # Check that it works when all files are there.
+ self.assertTrue(artifact.ArtifactStaged())
+
+ # Remove an arbitrary file among the ones staged, ensure the check fails
+ # and that the marker file is removed.
+ os.remove(random.choice(expected_installed_files))
+ self.assertTrue(os.path.exists(os.path.join(self.work_dir,
+ artifact.marker_name)))
+ self.assertFalse(artifact.ArtifactStaged())
+ self.assertFalse(os.path.exists(os.path.join(self.work_dir,
+ artifact.marker_name)))
+
if __name__ == '__main__':
unittest.main()
diff --git a/common_util.py b/common_util.py
index c6a317c..e516fd0 100644
--- a/common_util.py
+++ b/common_util.py
@@ -322,7 +322,7 @@
def ExtractTarball(tarball_path, install_path, files_to_extract=None,
- excluded_files=None):
+ excluded_files=None, return_extracted_files=False):
"""Extracts a tarball using tar.
Detects whether the tarball is compressed or not based on the file
@@ -333,10 +333,18 @@
install_path: Path to extract the tarball to.
files_to_extract: String of specific files in the tarball to extract.
excluded_files: String of files to not extract.
+ return_extracted_files: Whether or not the caller expects the list of
+ files extracted; if False, returns an empty list.
+ Returns:
+ List of absolute paths of the files extracted (possibly empty).
"""
# Deal with exclusions.
cmd = ['tar', 'xf', tarball_path, '--directory', install_path]
+ # If caller requires the list of extracted files, get verbose.
+ if return_extracted_files:
+ cmd += ['--verbose']
+
# Determine how to decompress.
tarball = os.path.basename(tarball_path)
if tarball.endswith('.tar.bz2'):
@@ -352,7 +360,12 @@
cmd.extend(files_to_extract)
try:
- subprocess.check_call(cmd)
+ cmd_output = subprocess.check_output(cmd)
+ if return_extracted_files:
+ return [os.path.join(install_path, filename)
+ for filename in cmd_output.strip('\n').splitlines()
+ if not filename.endswith('/')]
+ return []
except subprocess.CalledProcessError, e:
raise CommonUtilError(
'An error occurred when attempting to untar %s:\n%s' %