protected function checkFiles( LocalRepo $repo, array $paths, $verbose ) {
	if ( !count( $paths ) ) {
		return;
	}

	$dbr = $repo->getSlaveDB();

	$curNames = [];
	$oldNames = [];
	$imgIN = [];
	$oiWheres = [];
	foreach ( $paths as $path ) {
		$name = basename( $path );
		if ( preg_match( '#^archive/#', $path ) ) {
			if ( $verbose ) {
				$this->output( "Checking old file {$name}\n" );
			}

			$oldNames[] = $name;
			list( , $base ) = explode( '!', $name, 2 ); // <TS_MW>!<img_name>
			$oiWheres[] = $dbr->makeList(
				[ 'oi_name' => $base, 'oi_archive_name' => $name ],
				LIST_AND
			);
		} else {
			if ( $verbose ) {
				$this->output( "Checking current file {$name}\n" );
			}

			$curNames[] = $name;
			$imgIN[] = $name;
		}
	}

	$res = $dbr->query(
		$dbr->unionQueries(
			[
				$dbr->selectSQLText(
					'image',
					[ 'name' => 'img_name', 'old' => 0 ],
					$imgIN ? [ 'img_name' => $imgIN ] : '1=0'
				),
				$dbr->selectSQLText(
					'oldimage',
					[ 'name' => 'oi_archive_name', 'old' => 1 ],
					$oiWheres ? $dbr->makeList( $oiWheres, LIST_OR ) : '1=0'
				)
			],
			true
		),
		__METHOD__
	);

	$curNamesFound = [];
	$oldNamesFound = [];
	foreach ( $res as $row ) {
		if ( $row->old ) {
			$oldNamesFound[] = $row->name;
		} else {
			$curNamesFound[] = $row->name;
		}
	}

	foreach ( array_diff( $curNames, $curNamesFound ) as $name ) {
		$file = $repo->newFile( $name );
		// Print name and public URL to ease recovery
		if ( $file ) {
			$this->output( $name . "\n" . $file->getCanonicalUrl() . "\n\n" );
		} else {
			$this->error( "Cannot get URL for bad file title '{$name}'" );
		}
	}

	foreach ( array_diff( $oldNames, $oldNamesFound ) as $name ) {
		list( , $base ) = explode( '!', $name, 2 ); // <TS_MW>!<img_name>
		$file = $repo->newFromArchiveName( Title::makeTitle( NS_FILE, $base ), $name );
		// Print name and public URL to ease recovery
		$this->output( $name . "\n" . $file->getCanonicalUrl() . "\n\n" );
	}
}
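The `<TS_MW>!<img_name>` archive-name convention parsed above can be sketched in Python (for illustration only; the function name is hypothetical). Note the split is limited to the first `!`, as in the PHP above, so that base names which themselves contain `!` survive intact:

```python
def split_archive_name(archive_name):
    """Split "<TS_MW>!<img_name>" into (timestamp, base name).

    The split is limited to the first "!" because the image name
    itself may contain "!" characters.
    """
    timestamp, base = archive_name.split("!", 1)
    return timestamp, base

# A name containing "!" keeps its full base name:
pair = split_archive_name("20120101000000!Foo!bar.png")
```
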
/**
 * Create an OldLocalFile from a SHA-1 key
 * Do not call this except from inside a repo class.
 *
 * @param string $sha1 Base-36 SHA-1
 * @param LocalRepo $repo
 * @param string|bool $timestamp MW_timestamp (optional)
 *
 * @return bool|OldLocalFile
 */
static function newFromKey( $sha1, $repo, $timestamp = false ) {
	$dbr = $repo->getSlaveDB();

	$conds = array( 'oi_sha1' => $sha1 );
	if ( $timestamp ) {
		$conds['oi_timestamp'] = $dbr->timestamp( $timestamp );
	}

	$row = $dbr->selectRow( 'oldimage', self::selectFields(), $conds, __METHOD__ );
	if ( $row ) {
		return self::newFromRow( $row, $repo );
	} else {
		return false;
	}
}
/**
 * @param array|null $info
 */
function __construct( $info ) {
	parent::__construct( $info );
	$this->wiki = $info['wiki'];
	list( $this->dbName, $this->tablePrefix ) = wfSplitWikiID( $this->wiki );
	$this->hasSharedCache = $info['hasSharedCache'];
}
/**
 * Helper function: do the actual database query to fetch file metadata.
 *
 * @param string $key The file key
 * @param int $readFromDB Constant (default: DB_SLAVE)
 * @return bool
 */
protected function fetchFileMetadata( $key, $readFromDB = DB_SLAVE ) {
	// populate $fileMetadata[$key]
	$dbr = null;
	if ( $readFromDB === DB_MASTER ) {
		// sometimes reading from the master is necessary, if there's replication lag.
		$dbr = $this->repo->getMasterDb();
	} else {
		$dbr = $this->repo->getSlaveDb();
	}

	$row = $dbr->selectRow(
		'uploadstash',
		'*',
		array( 'us_key' => $key ),
		__METHOD__
	);
	if ( !is_object( $row ) ) {
		// key wasn't present in the database. this will happen sometimes.
		return false;
	}

	$this->fileMetadata[$key] = (array)$row;

	return true;
}
/**
 * Get the HTML text of the description page, if available
 *
 * @return string|bool HTML, or false if fetching is disabled or the fetch failed
 */
function getDescriptionText() {
	global $wgMemc, $wgLang;

	if ( !$this->repo->fetchDescription ) {
		return false;
	}

	$renderUrl = $this->repo->getDescriptionRenderUrl( $this->getName(), $wgLang->getCode() );
	if ( $renderUrl ) {
		if ( $this->repo->descriptionCacheExpiry > 0 ) {
			wfDebug( "Attempting to get the description from cache..." );
			$key = $this->repo->getLocalCacheKey(
				'RemoteFileDescription',
				'url',
				$wgLang->getCode(),
				$this->getName()
			);
			$obj = $wgMemc->get( $key );
			if ( $obj ) {
				wfDebug( "success!\n" );

				return $obj;
			}
			wfDebug( "miss\n" );
		}
		wfDebug( "Fetching shared description from {$renderUrl}\n" );
		$res = Http::get( $renderUrl );
		if ( $res && $this->repo->descriptionCacheExpiry > 0 ) {
			$wgMemc->set( $key, $res, $this->repo->descriptionCacheExpiry );
		}

		return $res;
	} else {
		return false;
	}
}
static function getHashPathForLevel( $name, $levels ) {
	global $wgContLang;

	// Strip any namespace prefix (e.g. "NS:name") before hashing,
	// then prepend the namespace index to the resulting path
	$bits = explode( ':', $name );
	$filename = $bits[count( $bits ) - 1];
	$path = parent::getHashPathForLevel( $filename, $levels );

	return count( $bits ) > 1
		? $wgContLang->getNsIndex( $bits[0] ) . '/' . $path
		: $path;
}
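The hash-path scheme the parent class implements (a progressively longer prefix of the MD5 of the file name for each directory level) can be sketched in Python; this is an illustrative reimplementation, not the PHP source:

```python
import hashlib

def hash_path_for_level(name, levels):
    """Sketch of a MediaWiki-style hash path: for N levels, each
    directory component is a progressively longer prefix of
    md5(name), e.g. "a/ab/" for two levels."""
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    return "".join(digest[:i] + "/" for i in range(1, levels + 1))

path = hash_path_for_level("Example.png", 2)  # e.g. "x/xy/"
```
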
public function execute() {
	if ( !$this->hasOption( 'delete' ) ) {
		$this->output( "Use --delete to actually confirm this script\n" );
		return;
	}

	# Data should come off the master, wrapped in a transaction
	$dbw = $this->getDB( DB_MASTER );
	$dbw->begin( __METHOD__ );
	$repo = RepoGroup::singleton()->getLocalRepo();

	# Get "active" revisions from the filearchive table
	$this->output( "Searching for and deleting archived files...\n" );
	$res = $dbw->select(
		'filearchive',
		array( 'fa_id', 'fa_storage_group', 'fa_storage_key', 'fa_sha1' ),
		'',
		__METHOD__
	);

	$count = 0;
	foreach ( $res as $row ) {
		$key = $row->fa_storage_key;
		if ( !strlen( $key ) ) {
			$this->output( "Entry with ID {$row->fa_id} has empty key, skipping\n" );
			continue;
		}

		$group = $row->fa_storage_group;
		$id = $row->fa_id;
		$path = $repo->getZonePath( 'deleted' ) . '/' . $repo->getDeletedHashPath( $key ) . $key;
		if ( isset( $row->fa_sha1 ) ) {
			$sha1 = $row->fa_sha1;
		} else {
			// old row, populate from key
			$sha1 = LocalRepo::getHashFromKey( $key );
		}

		// Check if the file is used anywhere...
		$inuse = $dbw->selectField(
			'oldimage',
			'1',
			array(
				'oi_sha1' => $sha1,
				$dbw->bitAnd( 'oi_deleted', File::DELETED_FILE ) => File::DELETED_FILE
			),
			__METHOD__,
			array( 'FOR UPDATE' )
		);

		$needForce = true;
		if ( !$repo->fileExists( $path ) ) {
			$this->output( "Notice - file '{$key}' not found in group '{$group}'\n" );
		} elseif ( $inuse ) {
			$this->output( "Notice - file '{$key}' is still in use\n" );
		} elseif ( !$repo->quickPurge( $path ) ) {
			$this->output( "Unable to remove file {$path}, skipping\n" );
			continue; // don't delete even with --force
		} else {
			$needForce = false;
		}

		if ( $needForce ) {
			if ( $this->hasOption( 'force' ) ) {
				$this->output( "Got --force, deleting DB entry\n" );
			} else {
				continue;
			}
		}

		$count++;
		$dbw->delete( 'filearchive', array( 'fa_id' => $id ), __METHOD__ );
	}
	$dbw->commit( __METHOD__ );
	$this->output( "Done! [{$count} file(s)]\n" );
}
/**
 * Find all instances of files with this key
 *
 * @param string $hash Base-36 SHA-1 hash
 * @return File[]
 */
function findBySha1( $hash ) {
	if ( !$this->reposInitialised ) {
		$this->initialiseRepos();
	}

	$result = $this->localRepo->findBySha1( $hash );
	foreach ( $this->foreignRepos as $repo ) {
		$result = array_merge( $result, $repo->findBySha1( $hash ) );
	}

	return $result;
}
/**
 * @param array|null $info
 */
function __construct( $info ) {
	parent::__construct( $info );
	$this->dbType = $info['dbType'];
	$this->dbServer = $info['dbServer'];
	$this->dbUser = $info['dbUser'];
	$this->dbPassword = $info['dbPassword'];
	$this->dbName = $info['dbName'];
	$this->dbFlags = $info['dbFlags'];
	$this->tablePrefix = $info['tablePrefix'];
	$this->hasSharedCache = $info['hasSharedCache'];
}
function __construct( $info ) {
	parent::__construct( $info );

	// Required settings
	$this->directory = $info['directory'];
	$this->url = $info['url'];
	$this->hashLevels = $info['hashLevels'];

	if ( isset( $info['cache'] ) ) {
		$this->cache = getcwd() . '/images/' . $info['cache'];
	}
}
protected function checkFiles( LocalRepo $repo, array $names, $verbose ) {
	if ( !count( $names ) ) {
		return;
	}

	$dbr = $repo->getSlaveDB();

	$imgIN = array();
	$oiWheres = array();
	foreach ( $names as $name ) {
		if ( strpos( $name, '!' ) !== false ) {
			if ( $verbose ) {
				$this->output( "Checking old file {$name}\n" );
			}

			// Limit the split so base names containing '!' stay intact
			list( , $base ) = explode( '!', $name, 2 ); // <TS_MW>!<img_name>
			$oiWheres[] = $dbr->makeList(
				array( 'oi_name' => $base, 'oi_archive_name' => $name ),
				LIST_AND
			);
		} else {
			if ( $verbose ) {
				$this->output( "Checking current file {$name}\n" );
			}

			$imgIN[] = $name;
		}
	}

	$res = $dbr->query(
		$dbr->unionQueries(
			array(
				$dbr->selectSQLText(
					'image',
					array( 'name' => 'img_name' ),
					array( 'img_name' => $imgIN )
				),
				$dbr->selectSQLText(
					'oldimage',
					array( 'name' => 'oi_archive_name' ),
					$dbr->makeList( $oiWheres, LIST_OR )
				)
			),
			true
		),
		__METHOD__
	);

	$namesFound = array();
	foreach ( $res as $row ) {
		$namesFound[] = $row->name;
	}

	$namesOrphans = array_diff( $names, $namesFound );
	foreach ( $namesOrphans as $name ) {
		// Print name and public URL to ease recovery
		if ( strpos( $name, '!' ) !== false ) {
			list( , $base ) = explode( '!', $name, 2 ); // <TS_MW>!<img_name>
			$file = $repo->newFromArchiveName( Title::makeTitle( NS_FILE, $base ), $name );
		} else {
			$file = $repo->newFile( $name );
		}
		$this->output( $name . "\n" . $file->getUrl() . "\n\n" );
	}
}
/**
 * Helper function: do the actual database query to fetch file metadata.
 *
 * @param string $key The file key
 * @return bool
 */
protected function fetchFileMetadata( $key ) {
	// populate $fileMetadata[$key]
	$dbr = $this->repo->getSlaveDb();
	$row = $dbr->selectRow(
		'uploadstash',
		'*',
		array( 'us_key' => $key ),
		__METHOD__
	);
	if ( !is_object( $row ) ) {
		// key wasn't present in the database. this will happen sometimes.
		return false;
	}

	$this->fileMetadata[$key] = array(
		'us_user' => $row->us_user,
		'us_key' => $row->us_key,
		'us_orig_path' => $row->us_orig_path,
		'us_path' => $row->us_path,
		'us_size' => $row->us_size,
		'us_sha1' => $row->us_sha1,
		'us_mime' => $row->us_mime,
		'us_media_type' => $row->us_media_type,
		'us_image_width' => $row->us_image_width,
		'us_image_height' => $row->us_image_height,
		'us_image_bits' => $row->us_image_bits,
		'us_source_type' => $row->us_source_type,
		'us_timestamp' => $row->us_timestamp,
		'us_status' => $row->us_status
	);

	return true;
}
/**
 * Find all instances of files with these keys
 *
 * @param array $hashes Base-36 SHA-1 hashes
 * @return array Array of File arrays, indexed by hash
 */
function findBySha1s( array $hashes ) {
	if ( !$this->reposInitialised ) {
		$this->initialiseRepos();
	}

	$result = $this->localRepo->findBySha1s( $hashes );
	foreach ( $this->foreignRepos as $repo ) {
		$result = array_merge_recursive( $result, $repo->findBySha1s( $hashes ) );
	}

	// Sort the merged (and presorted) sublist of each hash
	foreach ( $result as $hash => $files ) {
		usort( $result[$hash], 'File::compare' );
	}

	return $result;
}
/**
 * Find all instances of files with this key
 *
 * @param string $hash Base-36 SHA-1 hash
 * @return File[]
 */
function findBySha1( $hash ) {
	if ( !$this->reposInitialised ) {
		$this->initialiseRepos();
	}

	$result = $this->localRepo->findBySha1( $hash );
	foreach ( $this->foreignRepos as $repo ) {
		// Wikia Change - skip repos that do not check for duplicates
		if ( empty( $repo->checkDuplicates ) ) {
			continue;
		}
		// Wikia Change End
		$result = array_merge( $result, $repo->findBySha1( $hash ) );
	}

	return $result;
}
/**
 * Override handling of action=purge
 * @return bool
 */
public function doPurge() {
	$this->loadFile();
	if ( $this->mFile->exists() ) {
		wfDebug( 'ImagePage::doPurge purging ' . $this->mFile->getName() . "\n" );
		DeferredUpdates::addUpdate( new HTMLCacheUpdate( $this->mTitle, 'imagelinks' ) );
		$this->mFile->purgeCache( [ 'forThumbRefresh' => true ] );
	} else {
		wfDebug( 'ImagePage::doPurge no image for '
			. $this->mFile->getName() . "; limiting purge to cache only\n" );
		// even if the file supposedly doesn't exist, force any cached information
		// to be updated (in case the cached information is wrong)
		$this->mFile->purgeCache( [ 'forThumbRefresh' => true ] );
	}

	if ( $this->mRepo ) {
		// Purge redirect cache
		$this->mRepo->invalidateImageRedirect( $this->mTitle );
	}

	return parent::doPurge();
}
public function doDBUpdates() {
	$startTime = microtime( true );
	$dbw = wfGetDB( DB_MASTER );
	$table = 'filearchive';
	$conds = array( 'fa_sha1' => '', 'fa_storage_key IS NOT NULL' );

	if ( !$dbw->fieldExists( $table, 'fa_sha1', __METHOD__ ) ) {
		$this->output( "fa_sha1 column does not exist\n\n", true );

		return false;
	}

	$this->output( "Populating fa_sha1 field from fa_storage_key\n" );
	$endId = $dbw->selectField( $table, 'MAX(fa_id)', false, __METHOD__ );

	$batchSize = $this->mBatchSize;
	$done = 0;

	do {
		$res = $dbw->select(
			$table,
			array( 'fa_id', 'fa_storage_key' ),
			$conds,
			__METHOD__,
			array( 'LIMIT' => $batchSize )
		);

		$i = 0;
		foreach ( $res as $row ) {
			if ( $row->fa_storage_key == '' ) {
				// Revision was missing pre-deletion
				continue;
			}

			$sha1 = LocalRepo::getHashFromKey( $row->fa_storage_key );
			$dbw->update(
				$table,
				array( 'fa_sha1' => $sha1 ),
				array( 'fa_id' => $row->fa_id ),
				__METHOD__
			);
			$lastId = $row->fa_id;
			$i++;
		}

		$done += $i;
		if ( $i !== $batchSize ) {
			break;
		}

		// print status and let slaves catch up
		$this->output( sprintf(
			"id %d done (up to %d), %5.3f%% \r", $lastId, $endId, $lastId / $endId * 100 ) );
		wfWaitForSlaves();
	} while ( true );

	$processingTime = microtime( true ) - $startTime;
	$this->output( sprintf( "\nDone %d files in %.1f seconds\n", $done, $processingTime ) );

	return true; // we only updated *some* files, don't log
}
/**
 * Output the chunk to disk
 *
 * @param string $chunkPath
 * @throws UploadChunkFileException
 * @return FileRepoStatus
 */
private function outputChunk( $chunkPath ) {
	// Key is fileKey + chunk index
	$fileKey = $this->getChunkFileKey();

	// Store the chunk per its indexed fileKey:
	$hashPath = $this->repo->getHashPath( $fileKey );
	$storeStatus = $this->repo->quickImport(
		$chunkPath,
		$this->repo->getZonePath( 'temp' ) . "/{$hashPath}{$fileKey}"
	);

	// Check for error in stashing the chunk:
	if ( !$storeStatus->isOK() ) {
		$error = $storeStatus->getErrorsArray();
		$error = reset( $error );
		if ( !count( $error ) ) {
			$error = $storeStatus->getWarningsArray();
			$error = reset( $error );
			if ( !count( $error ) ) {
				$error = array( 'unknown', 'no error recorded' );
			}
		}
		throw new UploadChunkFileException(
			"Error storing file in '{$chunkPath}': " . implode( '; ', $error )
		);
	}

	return $storeStatus;
}
function newFile( $title, $time = false ) {
	if ( empty( $title ) ) {
		return null;
	}

	return parent::newFile( $title, $time );
}
/**
 * Load ArchivedFile object fields from a DB row.
 *
 * @param stdClass $row Database row object
 * @since 1.21
 */
public function loadFromRow( $row ) {
	$this->id = intval( $row->fa_id );
	$this->name = $row->fa_name;
	$this->archive_name = $row->fa_archive_name;
	$this->group = $row->fa_storage_group;
	$this->key = $row->fa_storage_key;
	$this->size = $row->fa_size;
	$this->bits = $row->fa_bits;
	$this->width = $row->fa_width;
	$this->height = $row->fa_height;
	$this->metadata = $row->fa_metadata;
	$this->mime = "{$row->fa_major_mime}/{$row->fa_minor_mime}";
	$this->media_type = $row->fa_media_type;
	$this->description = $row->fa_description;
	$this->user = $row->fa_user;
	$this->user_text = $row->fa_user_text;
	$this->timestamp = $row->fa_timestamp;
	$this->deleted = $row->fa_deleted;
	if ( isset( $row->fa_sha1 ) ) {
		$this->sha1 = $row->fa_sha1;
	} else {
		// old row, populate from key
		$this->sha1 = LocalRepo::getHashFromKey( $this->key );
	}
}
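The "populate from key" fallback above relies on the shape of `fa_storage_key`: the base-36 SHA-1 followed by a dot and the file extension, so the hash is everything before the first dot. A hypothetical Python sketch of that extraction:

```python
def get_hash_from_key(key):
    """Sketch of deriving the base-36 SHA-1 from a storage key of the
    form "<sha1>.<ext>": take everything before the first dot."""
    return key.split(".", 1)[0]

sha1 = get_hash_from_key("phoiac9h4m842xq45sp7s6u21eteeq1.png")
```
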
/**
 * Get the SHA-1 base 36 hash of the file
 *
 * @return string
 */
function getSha1() {
	$this->assertRepoDefined();

	return $this->repo->getFileSha1( $this->getPath() );
}
public function execute() {
	$user = $this->getUser();
	// Before doing anything at all, let's check permissions
	if ( !$user->isAllowed( 'deletedhistory' ) ) {
		$this->dieUsage(
			'You don\'t have permission to view deleted file information',
			'permissiondenied'
		);
	}

	$db = $this->getDB();

	$params = $this->extractRequestParams();

	$prop = array_flip( $params['prop'] );
	$fld_sha1 = isset( $prop['sha1'] );
	$fld_timestamp = isset( $prop['timestamp'] );
	$fld_user = isset( $prop['user'] );
	$fld_size = isset( $prop['size'] );
	$fld_dimensions = isset( $prop['dimensions'] );
	$fld_description = isset( $prop['description'] ) || isset( $prop['parseddescription'] );
	$fld_mime = isset( $prop['mime'] );
	$fld_mediatype = isset( $prop['mediatype'] );
	$fld_metadata = isset( $prop['metadata'] );
	$fld_bitdepth = isset( $prop['bitdepth'] );
	$fld_archivename = isset( $prop['archivename'] );

	$this->addTables( 'filearchive' );

	$this->addFields( array( 'fa_name', 'fa_deleted' ) );
	$this->addFieldsIf( 'fa_storage_key', $fld_sha1 );
	$this->addFieldsIf( 'fa_timestamp', $fld_timestamp );
	$this->addFieldsIf( array( 'fa_user', 'fa_user_text' ), $fld_user );
	$this->addFieldsIf( array( 'fa_height', 'fa_width', 'fa_size' ), $fld_dimensions || $fld_size );
	$this->addFieldsIf( 'fa_description', $fld_description );
	$this->addFieldsIf( array( 'fa_major_mime', 'fa_minor_mime' ), $fld_mime );
	$this->addFieldsIf( 'fa_media_type', $fld_mediatype );
	$this->addFieldsIf( 'fa_metadata', $fld_metadata );
	$this->addFieldsIf( 'fa_bits', $fld_bitdepth );
	$this->addFieldsIf( 'fa_archive_name', $fld_archivename );

	if ( !is_null( $params['continue'] ) ) {
		$cont = explode( '|', $params['continue'] );
		if ( count( $cont ) != 1 ) {
			$this->dieUsage( 'Invalid continue param. You should pass the ' .
				'original value returned by the previous query', '_badcontinue' );
		}
		$op = $params['dir'] == 'descending' ? '<' : '>';
		$cont_from = $db->addQuotes( $cont[0] );
		$this->addWhere( "fa_name {$op}= {$cont_from}" );
	}

	// Image filters
	$dir = ( $params['dir'] == 'descending' ? 'older' : 'newer' );
	$from = is_null( $params['from'] ) ? null : $this->titlePartToKey( $params['from'] );
	if ( !is_null( $params['continue'] ) ) {
		$from = $params['continue'];
	}
	$to = is_null( $params['to'] ) ? null : $this->titlePartToKey( $params['to'] );
	$this->addWhereRange( 'fa_name', $dir, $from, $to );
	if ( isset( $params['prefix'] ) ) {
		$this->addWhere( 'fa_name' . $db->buildLike(
			$this->titlePartToKey( $params['prefix'] ), $db->anyString() ) );
	}

	$sha1Set = isset( $params['sha1'] );
	$sha1base36Set = isset( $params['sha1base36'] );
	if ( $sha1Set || $sha1base36Set ) {
		global $wgMiserMode;
		if ( $wgMiserMode ) {
			$this->dieUsage( 'Search by hash disabled in Miser Mode', 'hashsearchdisabled' );
		}

		$sha1 = false;
		if ( $sha1Set ) {
			if ( !$this->validateSha1Hash( $params['sha1'] ) ) {
				$this->dieUsage( 'The SHA1 hash provided is not valid', 'invalidsha1hash' );
			}
			$sha1 = wfBaseConvert( $params['sha1'], 16, 36, 31 );
		} elseif ( $sha1base36Set ) {
			if ( !$this->validateSha1Base36Hash( $params['sha1base36'] ) ) {
				$this->dieUsage( 'The SHA1Base36 hash provided is not valid',
					'invalidsha1base36hash' );
			}
			$sha1 = $params['sha1base36'];
		}
		if ( $sha1 ) {
			$this->addWhere( 'fa_storage_key ' . $db->buildLike( "{$sha1}.", $db->anyString() ) );
		}
	}

	if ( !$user->isAllowed( 'suppressrevision' ) ) {
		// Filter out revisions that the user is not allowed to see. There
		// is no way to indicate that we have skipped stuff because the
		// continuation parameter is fa_name

		// Note that this field is unindexed. This should however not be
		// a big problem as files with fa_deleted are rare
		$this->addWhereFld( 'fa_deleted', 0 );
	}

	$limit = $params['limit'];
	$this->addOption( 'LIMIT', $limit + 1 );
	$sort = ( $params['dir'] == 'descending' ? ' DESC' : '' );
	$this->addOption( 'ORDER BY', 'fa_name' . $sort );

	$res = $this->select( __METHOD__ );

	$count = 0;
	$result = $this->getResult();
	foreach ( $res as $row ) {
		if ( ++$count > $limit ) {
			// We've reached the one extra which shows that there are
			// additional pages to be had. Stop here...
			$this->setContinueEnumParameter( 'continue', $row->fa_name );
			break;
		}

		$file = array();
		$file['name'] = $row->fa_name;
		$title = Title::makeTitle( NS_FILE, $row->fa_name );
		self::addTitleInfo( $file, $title );

		if ( $fld_sha1 ) {
			$file['sha1'] = wfBaseConvert(
				LocalRepo::getHashFromKey( $row->fa_storage_key ), 36, 16, 40 );
		}
		if ( $fld_timestamp ) {
			$file['timestamp'] = wfTimestamp( TS_ISO_8601, $row->fa_timestamp );
		}
		if ( $fld_user ) {
			$file['userid'] = $row->fa_user;
			$file['user'] = $row->fa_user_text;
		}
		if ( $fld_size || $fld_dimensions ) {
			$file['size'] = $row->fa_size;

			$pageCount = ArchivedFile::newFromRow( $row )->pageCount();
			if ( $pageCount !== false ) {
				// Was mistakenly assigned to an unused $vals array
				$file['pagecount'] = $pageCount;
			}

			$file['height'] = $row->fa_height;
			$file['width'] = $row->fa_width;
		}
		if ( $fld_description ) {
			$file['description'] = $row->fa_description;
			if ( isset( $prop['parseddescription'] ) ) {
				$file['parseddescription'] = Linker::formatComment( $row->fa_description, $title );
			}
		}
		if ( $fld_mediatype ) {
			$file['mediatype'] = $row->fa_media_type;
		}
		if ( $fld_metadata ) {
			$file['metadata'] = $row->fa_metadata
				? ApiQueryImageInfo::processMetaData( unserialize( $row->fa_metadata ), $result )
				: null;
		}
		if ( $fld_bitdepth ) {
			$file['bitdepth'] = $row->fa_bits;
		}
		if ( $fld_mime ) {
			$file['mime'] = "{$row->fa_major_mime}/{$row->fa_minor_mime}";
		}
		if ( $fld_archivename && !is_null( $row->fa_archive_name ) ) {
			$file['archivename'] = $row->fa_archive_name;
		}
		if ( $row->fa_deleted & File::DELETED_FILE ) {
			$file['filehidden'] = '';
		}
		if ( $row->fa_deleted & File::DELETED_COMMENT ) {
			$file['commenthidden'] = '';
		}
		if ( $row->fa_deleted & File::DELETED_USER ) {
			$file['userhidden'] = '';
		}
		if ( $row->fa_deleted & File::DELETED_RESTRICTED ) {
			// This file is deleted for normal admins
			$file['suppressed'] = '';
		}

		$fit = $result->addValue( array( 'query', $this->getModuleName() ), null, $file );
		if ( !$fit ) {
			$this->setContinueEnumParameter( 'continue', $row->fa_name );
			break;
		}
	}

	$result->setIndexedTagName_internal( array( 'query', $this->getModuleName() ), 'fa' );
}
/**
 * Roll back the DB transaction and mark the image unlocked
 */
function unlockAndRollback() {
	$this->locked = false;
	$dbw = $this->repo->getMasterDB();
	$dbw->rollback( __METHOD__ );
}
/**
 * Run the transaction, except the cleanup batch.
 * The cleanup batch should be run in a separate transaction, because it locks different
 * rows and there's no need to keep the image row locked while it's acquiring those locks.
 * The caller may have its own transaction open, so we save the batch and let the caller
 * call cleanup().
 * @return FileRepoStatus
 */
function execute() {
	global $wgLang;

	if ( !$this->all && !$this->ids ) {
		// Do nothing
		return $this->file->repo->newGood();
	}

	$exists = $this->file->lock();
	$dbw = $this->file->repo->getMasterDB();
	$status = $this->file->repo->newGood();

	// Fetch all or selected archived revisions for the file,
	// sorted from the most recent to the oldest.
	$conditions = array( 'fa_name' => $this->file->getName() );

	if ( !$this->all ) {
		$conditions['fa_id'] = $this->ids;
	}

	$result = $dbw->select(
		'filearchive',
		ArchivedFile::selectFields(),
		$conditions,
		__METHOD__,
		array( 'ORDER BY' => 'fa_timestamp DESC' )
	);

	$idsPresent = array();
	$storeBatch = array();
	$insertBatch = array();
	$insertCurrent = false;
	$deleteIds = array();
	$first = true;
	$archiveNames = array();

	foreach ( $result as $row ) {
		$idsPresent[] = $row->fa_id;

		if ( $row->fa_name != $this->file->getName() ) {
			$status->error( 'undelete-filename-mismatch', $wgLang->timeanddate( $row->fa_timestamp ) );
			$status->failCount++;
			continue;
		}

		if ( $row->fa_storage_key == '' ) {
			// Revision was missing pre-deletion
			$status->error( 'undelete-bad-store-key', $wgLang->timeanddate( $row->fa_timestamp ) );
			$status->failCount++;
			continue;
		}

		$deletedRel = $this->file->repo->getDeletedHashPath( $row->fa_storage_key ) .
			$row->fa_storage_key;
		$deletedUrl = $this->file->repo->getVirtualUrl() . '/deleted/' . $deletedRel;

		if ( isset( $row->fa_sha1 ) ) {
			$sha1 = $row->fa_sha1;
		} else {
			// old row, populate from key
			$sha1 = LocalRepo::getHashFromKey( $row->fa_storage_key );
		}

		# Fix leading zero
		if ( strlen( $sha1 ) == 32 && $sha1[0] == '0' ) {
			$sha1 = substr( $sha1, 1 );
		}

		if ( is_null( $row->fa_major_mime ) || $row->fa_major_mime == 'unknown'
			|| is_null( $row->fa_minor_mime ) || $row->fa_minor_mime == 'unknown'
			|| is_null( $row->fa_media_type ) || $row->fa_media_type == 'UNKNOWN'
			|| is_null( $row->fa_metadata )
		) {
			// Refresh our metadata
			// Required for a new current revision; nice for older ones too. :)
			$props = RepoGroup::singleton()->getFileProps( $deletedUrl );
		} else {
			$props = array(
				'minor_mime' => $row->fa_minor_mime,
				'major_mime' => $row->fa_major_mime,
				'media_type' => $row->fa_media_type,
				'metadata' => $row->fa_metadata
			);
		}

		if ( $first && !$exists ) {
			// This revision will be published as the new current version
			$destRel = $this->file->getRel();
			$insertCurrent = array(
				'img_name' => $row->fa_name,
				'img_size' => $row->fa_size,
				'img_width' => $row->fa_width,
				'img_height' => $row->fa_height,
				'img_metadata' => $props['metadata'],
				'img_bits' => $row->fa_bits,
				'img_media_type' => $props['media_type'],
				'img_major_mime' => $props['major_mime'],
				'img_minor_mime' => $props['minor_mime'],
				'img_description' => $row->fa_description,
				'img_user' => $row->fa_user,
				'img_user_text' => $row->fa_user_text,
				'img_timestamp' => $row->fa_timestamp,
				'img_sha1' => $sha1
			);

			// The live (current) version cannot be hidden!
			if ( !$this->unsuppress && $row->fa_deleted ) {
				$status->fatal( 'undeleterevdel' );
				$this->file->unlock();

				return $status;
			}
		} else {
			$archiveName = $row->fa_archive_name;

			if ( $archiveName == '' ) {
				// This was originally a current version; we
				// have to devise a new archive name for it.
				// Format is <timestamp of archiving>!<name>
				$timestamp = wfTimestamp( TS_UNIX, $row->fa_deleted_timestamp );

				do {
					$archiveName = wfTimestamp( TS_MW, $timestamp ) . '!' . $row->fa_name;
					$timestamp++;
				} while ( isset( $archiveNames[$archiveName] ) );
			}

			$archiveNames[$archiveName] = true;
			$destRel = $this->file->getArchiveRel( $archiveName );
			$insertBatch[] = array(
				'oi_name' => $row->fa_name,
				'oi_archive_name' => $archiveName,
				'oi_size' => $row->fa_size,
				'oi_width' => $row->fa_width,
				'oi_height' => $row->fa_height,
				'oi_bits' => $row->fa_bits,
				'oi_description' => $row->fa_description,
				'oi_user' => $row->fa_user,
				'oi_user_text' => $row->fa_user_text,
				'oi_timestamp' => $row->fa_timestamp,
				'oi_metadata' => $props['metadata'],
				'oi_media_type' => $props['media_type'],
				'oi_major_mime' => $props['major_mime'],
				'oi_minor_mime' => $props['minor_mime'],
				'oi_deleted' => $this->unsuppress ? 0 : $row->fa_deleted,
				'oi_sha1' => $sha1
			);
		}

		$deleteIds[] = $row->fa_id;

		if ( !$this->unsuppress && $row->fa_deleted & File::DELETED_FILE ) {
			// private files can stay where they are
			$status->successCount++;
		} else {
			$storeBatch[] = array( $deletedUrl, 'public', $destRel );
			$this->cleanupBatch[] = $row->fa_storage_key;
		}

		$first = false;
	}

	unset( $result );

	// Add a warning to the status object for missing IDs
	$missingIds = array_diff( $this->ids, $idsPresent );

	foreach ( $missingIds as $id ) {
		$status->error( 'undelete-missing-filearchive', $id );
	}

	// Remove missing files from batch, so we don't get errors when undeleting them
	$storeBatch = $this->removeNonexistentFiles( $storeBatch );

	// Run the store batch
	// Use the OVERWRITE_SAME flag to smooth over a common error
	$storeStatus = $this->file->repo->storeBatch( $storeBatch, FileRepo::OVERWRITE_SAME );
	$status->merge( $storeStatus );

	if ( !$status->isGood() ) {
		// Even if some files could be copied, fail entirely as that is the
		// easiest thing to do without data loss
		$this->cleanupFailedBatch( $storeStatus, $storeBatch );
		$status->ok = false;
		$this->file->unlock();

		return $status;
	}

	// Run the DB updates
	// Because we have locked the image row, key conflicts should be rare.
	// If they do occur, we can roll back the transaction at this time with
	// no data loss, but leaving unregistered files scattered throughout the
	// public zone.
	// This is not ideal, which is why it's important to lock the image row.
	if ( $insertCurrent ) {
		$dbw->insert( 'image', $insertCurrent, __METHOD__ );
	}

	if ( $insertBatch ) {
		$dbw->insert( 'oldimage', $insertBatch, __METHOD__ );
	}

	if ( $deleteIds ) {
		$dbw->delete( 'filearchive', array( 'fa_id' => $deleteIds ), __METHOD__ );
	}

	// If store batch is empty (all files are missing), deletion is to be considered successful
	if ( $status->successCount > 0 || !$storeBatch ) {
		if ( !$exists ) {
			wfDebug( __METHOD__ . " restored {$status->successCount} items, creating a new current\n" );

			DeferredUpdates::addUpdate( SiteStatsUpdate::factory( array( 'images' => 1 ) ) );

			$this->file->purgeEverything();
		} else {
			wfDebug( __METHOD__ . " restored {$status->successCount} as archived versions\n" );
			$this->file->purgeDescription();
			$this->file->purgeHistory();
		}
	}

	$this->file->unlock();

	return $status;
}
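The "Fix leading zero" step above reflects the base-36 SHA-1 representation used throughout this code: a 160-bit SHA-1 fits in at most 31 base-36 digits, and shorter values are zero-padded (some old storage keys were padded to 32, hence the strip). A hypothetical Python sketch of the conversion that `wfBaseConvert( $sha1, 16, 36, 31 )` performs:

```python
def sha1_hex_to_base36(hex_digest, pad=31):
    """Convert a hex SHA-1 digest to the base-36 form used in storage
    keys: a 160-bit value needs at most 31 base-36 digits, and the
    result is left-padded with zeros to a fixed width."""
    n = int(hex_digest, 16)
    digits = "0123456789abcdefghijklmnopqrstuvwxyz"
    out = ""
    while n:
        n, r = divmod(n, 36)
        out = digits[r] + out
    return out.rjust(pad, "0")

# SHA-1 of the empty string, in hex, converted to base-36:
key_hash = sha1_hex_to_base36("da39a3ee5e6b4b0d3255bfef95601890afd80709")
```
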
protected function getDeletedPath( LocalRepo $repo, LocalFile $file ) {
	$hash = $repo->getFileSha1( $file->getPath() );
	$key = "{$hash}.{$file->getExtension()}";

	return $repo->getDeletedHashPath( $key ) . $key;
}