Download depot files into local disk without a client workspace


I read in another post here that I can download a file from the Perforce depot onto local disk without a client workspace. To extend that further, I need to download all files (text and binary) from a depot directory onto my local disk. Is this the correct p4 command to do that?

p4 print //depot/dir1/...

I have a few questions:

  1. Will it download all subdirectories as well as files from //depot/dir1/..., or files only?
  2. Will it keep the original names of the files that will be downloaded?
  3. Where are the files located on the local disk given no local path is specified?

I'm using the p4api.net library. Will this code do it?

    public void GetFiles(string DepotFilePath)
    {
        P4Command cmd = new P4Command(_repository, "print", true,
            String.Format("{0}/...", DepotFilePath));

        P4CommandResult results = cmd.Run();
        if (results.Success)
        {
            //do something here
        }
    }

I'm also not sure where on the local disk it will dump the files.

Thank you for your help in advance.

---

One-Liner

For reference, here's a bash one-liner that performs the manual procedure described in jhwist's answer:

for _file in $(p4 files //depot/dir/... | awk '{print $1}' | perl -ne '/(.*)#\d+$/ && print "$1\n"'); do p4 print -q $_file > /path/to/target/dir/$(basename $_file); done

The only bits you have to replace are //depot/dir and /path/to/target/dir (the directories in question). Note: the target directory has to already exist, and $(basename ...) flattens all subdirectories into the single target directory, so files with the same name in different depot subdirectories will overwrite each other.


Explanation:

Pulling out what the for loop is iterating over:

$(p4 files //depot/dir/... | awk '{print $1}' | perl -ne '/(.*)#\d+$/ && print "$1\n"')
  1. Get the list of files in the Perforce directory in question

  2. The first column of the output is the depot path of each file, so extract it with awk

  3. Each depot path has #revisionNumber tacked onto the end, so strip it off with perl
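
For example, given a hypothetical depot file foo.c, a single line of output moves through the pipeline like this:

    //depot/dir/sub/foo.c#7 - edit change 1234 (text)   # p4 files output
    //depot/dir/sub/foo.c#7                              # after awk '{print $1}'
    //depot/dir/sub/foo.c                                # after the perl strip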

---

Simple solution requiring no platform-specific scripting tools:

p4 client -df TEMP_CLIENT
p4 -c TEMP_CLIENT --field View="//depot/dir1/... //TEMP_CLIENT/..." client -o | p4 client -i
p4 -c TEMP_CLIENT sync -p
p4 client -d TEMP_CLIENT

This will download all files/directories from //depot/dir1 into your current directory. If you want a different target directory, add --field "Root=C:\Some Path" to the command where the View is specified (the second command above).
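
For example, here is a sketch of that command with a custom root added (the path is a placeholder):

p4 -c TEMP_CLIENT --field View="//depot/dir1/... //TEMP_CLIENT/..." --field "Root=C:\Some Path" client -o | p4 client -i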

---

So I've spent some time getting this to work under different scenarios in bash, and even though it is not .NET-specific, I thought I'd share my results, as the concepts are universal.

You have two options to get files from remote:

  1. Create and sync a temporary client/workspace with the right view mapping, as Sam Stafford wrote in his answer. Here it is in bash, with comments:
# p4 uses this environment variable as the client/workspace name by default.
# Alternatively, you can also pass -c to every command.
export P4CLIENT="TEMP_SCRIPT_CLIENT"

# map the remote folder to the workspace root (recursive)
workspaceMapping="//depot/dir1/... //$P4CLIENT/..."
p4 client -d $P4CLIENT # make sure the workspace does not exist already
# Creating a client on the console pops up an editor to confirm the
# workspace specification, unless you provide a specification on stdin.
# You can, however, generate one to stdout and pipe the result to stdin.
p4 --field View="$workspaceMapping" --field Root="/some/local/path" client -o |
  p4 client -i
p4 sync -p # download all the files
p4 client -d $P4CLIENT # remove the workspace when you are done
  2. Use p4 print to print the content of each file, as jhwist suggested, so you don't need to define a workspace at all. The disadvantage is that you have to handle each file individually and create any directories yourself.
p4RemoteRoot="//depot/dir1"
localPath="/some/local/path" # local target directory
# First, list files and strip output to file path
# because `p4 files` prints something like this:
# //depot/dir1/file1.txt#1 - add change 43444817 (text)
# //depot/dir1/folder/file2.txt#11 - edit change 43783713 (text)
files="$(p4 files $p4RemoteRoot/... | sed 's/\(.*\)#.*/\1/')"
for wsFile in $files; do
  # construct the local path from the remote one
  targetFile="$localPath/${wsFile#$p4RemoteRoot/}"
  # create the parent dir if it doesn't exist. p4 files doesn't list directories
  mkdir -p $(dirname $targetFile)
  # print the file content from remote and write that to the local file.
  p4 print -q $wsFile > $targetFile
done

Note: I couldn't find any documentation for the --field argument, but it seems you can use anything listed under "Form Fields" in the docs: https://www.perforce.com/manuals/v18.2/cmdref/Content/CmdRef/p4_client.html

---

p4 print will write the content to standard output (see the excellent manual). Therefore, to answer your questions in order:

  1. It will "download" all files in all subdirectories, but it will only print the file content on stdout. It will not generate files on disk.
  2. Yes, sort of, but not in the way you imagine. Within your stream on stdout, there will be lines like this: //depot/path/to/file#5 - edit change 430530 (text), followed by the content of that particular file.
  3. Nowhere, no files will be created on disk.

If you really don't want to create a client workspace for your task (why?), then you'd have to do something like the following:

  1. Get a list of files with e.g. p4 files (manual)
  2. Iterate over this list and call p4 print for each file
  3. Redirect the output of p4 print to a file on the local disk, adhering to the directory structure in the depot.
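
A minimal sketch of those three steps in bash (essentially what the longer answer above does; //depot/dir1 and /path/to/local/dir are placeholders):

# 1. list the files, keeping only the depot path (strip the #rev suffix)
for f in $(p4 files //depot/dir1/... | sed 's/\(.*\)#.*/\1/'); do
  # 3. mirror the depot's directory structure under the local target
  target="/path/to/local/dir/${f#//depot/dir1/}"
  mkdir -p "$(dirname "$target")"
  # 2. print the file content quietly (no header line) into the local file
  p4 print -q "$f" > "$target"
done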