supplied.
\item [Name = \lt{}name\gt{}]
- \index[dir]{Name }
+ \index[dir]{Name}
+ \index[dir]{Directive!Name}
The director name used by the system administrator. This directive is
required.
\item [Description = \lt{}text\gt{}]
- \index[dir]{Description }
+ \index[dir]{Description}
+ \index[dir]{Directive!Description}
The text field contains a description of the Director that will be displayed
in the graphical user interface. This directive is optional.
\item [Password = \lt{}UA-password\gt{}]
- \index[dir]{Password }
+ \index[dir]{Password}
+ \index[dir]{Directive!Password}
Specifies the password that must be supplied for the default Bacula Console
to be authorized. The same password must appear in the {\bf Director}
resource of the Console configuration file. For added security, the password
blank and you must manually supply it.
\item [Messages = \lt{}Messages-resource-name\gt{}]
- \index[dir]{Messages }
+ \index[dir]{Messages}
+ \index[dir]{Directive!Messages}
The messages resource specifies where to deliver Director messages that are
not associated with a specific Job. Most messages are specific to a job and
will be directed to the Messages resource specified by the job. However,
directive is required.
\item [Working Directory = \lt{}Directory\gt{}]
- \index[dir]{Working Directory }
+ \index[dir]{Working Directory}
+ \index[dir]{Directive!Working Directory}
This directive is mandatory and specifies a directory in which the Director
may put its status files. This directory should be used only by Bacula but
may be shared by other Bacula daemons. However, please note, if this
Directory} is done when the configuration file is read so that values such
as {\bf \$HOME} will be properly expanded. This directive is required.
+ If you have specified a Director user and/or a Director group on your
+ ./configure line with {\bf {-}{-}with-dir-user} and/or
+ {\bf {-}{-}with-dir-group}, the Working Directory owner and group will
+ be set to those values.
+
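+ For example, a minimal sketch (the path shown is a typical choice,
+ not a requirement):
+
+\footnotesize
+\begin{verbatim}
+Working Directory = "/var/bacula/working"
+\end{verbatim}
+\normalsize
+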
\item [Pid Directory = \lt{}Directory\gt{}]
- \index[dir]{Pid Directory }
+ \index[dir]{Pid Directory}
+ \index[dir]{Directive!Pid Directory}
This directive is mandatory and specifies a directory in which the Director
may put its process Id file. The process Id file is used to shut down
Bacula and to prevent multiple copies of Bacula from running simultaneously.
Directory} as defined above. This directive is required.
\item [Scripts Directory = \lt{}Directory\gt{}]
- \index[dir]{Scripts Directory }
+ \index[dir]{Scripts Directory}
+ \index[dir]{Directive!Scripts Directory}
This directive is optional and, if defined, specifies a directory in
which the Director will look for the Python startup script {\bf
DirStartup.py}. This directory may be shared by other Bacula daemons.
expanded.
\item [QueryFile = \lt{}Path\gt{}]
- \index[dir]{QueryFile }
+ \index[dir]{QueryFile}
+ \index[dir]{Directive!QueryFile}
This directive is mandatory and specifies a directory and file in which
the Director can find the canned SQL statements for the {\bf Query}
command of the Console. Standard shell expansion of the {\bf Path} is
\label{DirMaxConJobs}
\item [Maximum Concurrent Jobs = \lt{}number\gt{}]
-\index[dir]{Maximum Concurrent Jobs }
+\index[dir]{Maximum Concurrent Jobs}
+\index[dir]{Directive!Maximum Concurrent Jobs}
\index[general]{Simultaneous Jobs}
\index[general]{Concurrent Jobs}
where \lt{}number\gt{} is the maximum number of total Director Jobs that
of this manual.
\item [FD Connect Timeout = \lt{}time\gt{}]
- \index[dir]{FD Connect Timeout }
+ \index[dir]{FD Connect Timeout}
+ \index[dir]{Directive!FD Connect Timeout}
where {\bf time} is the time that the Director should continue
attempting to contact the File daemon to start a job, and after which
the Director will cancel the job. The default is 30 minutes.
\item [SD Connect Timeout = \lt{}time\gt{}]
- \index[dir]{SD Connect Timeout }
+ \index[dir]{SD Connect Timeout}
+ \index[dir]{Directive!SD Connect Timeout}
where {\bf time} is the time that the Director should continue
attempting to contact the Storage daemon to start a job, and after which
the Director will cancel the job. The default is 30 minutes.
\item [DirAddresses = \lt{}IP-address-specification\gt{}]
- \index[dir]{DirAddresses }
+ \index[dir]{DirAddresses}
+ \index[dir]{Directive!DirAddresses}
Specify the ports and addresses on which the Director daemon will listen
for Bacula Console connections. Probably the simplest way to explain
this is to show an example:
ipv6 = {
addr = 1.2.3.4;
port = 1205;
- }
+ }
ip = {
addr = 1.2.3.4
port = 1205
- }
+ }
ip = {
addr = 1.2.3.4
- }
+ }
ip = {
addr = 201:220:222::2
- }
+ }
ip = {
addr = bluedot.thun.net
- }
+ }
}
\end{verbatim}
\normalsize
the resolution can be made either by IPv4 or IPv6. If ip4 is specified, then
only IPv4 resolutions will be permitted, and likewise with ip6.
+Please note that if you use the DirAddresses directive, you must
+not use either a DirPort or a DirAddress directive in the same
+resource.
+
\item [DIRport = \lt{}port-number\gt{}]
- \index[dir]{DIRport }
+ \index[dir]{DIRport}
+ \index[dir]{Directive!DIRport}
Specify the port (a positive integer) on which the Director daemon will
listen for Bacula Console connections. This same port number must be
specified in the Director resource of the Console configuration file. The
directive is not needed if you specify DirAddresses.
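+ For example (9101 is Bacula's customary Director port):
+
+\footnotesize
+\begin{verbatim}
+DIRport = 9101
+\end{verbatim}
+\normalsize
+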
\item [DirAddress = \lt{}IP-Address\gt{}]
- \index[dir]{DirAddress }
+ \index[dir]{DirAddress}
+ \index[dir]{Directive!DirAddress}
This directive is optional, but if it is specified, it will cause the
Director server (for the Console program) to bind to the specified {\bf
IP-Address}, which is either a domain name or an IP address specified as a
\item [Job]
\index[dir]{Job}
+ \index[dir]{Directive!Job}
Start of the Job resource. At least one Job resource is required.
\item [Name = \lt{}name\gt{}]
- \index[dir]{Name }
+ \index[dir]{Name}
+ \index[dir]{Directive!Name}
The Job name. This name can be specified on the {\bf Run} command in the
console program to start a job. If the name contains spaces, it must be
specified between quotes. It is generally a good idea to give your job the
specify here followed by the date and time the job was scheduled for
execution. This directive is required.
+\item [Enabled = \lt{}yes|no\gt{}]
+ \index[dir]{Enabled}
+ \index[dir]{Directive!Enabled}
+ This directive allows you to enable or disable automatic execution
+ of a Job via the scheduler.
+
\item [Type = \lt{}job-type\gt{}]
- \index[dir]{Type }
+ \index[dir]{Type}
+ \index[dir]{Directive!Type}
The {\bf Type} directive specifies the Job type, which may be one of the
following: {\bf Backup}, {\bf Restore}, {\bf Verify}, or {\bf Admin}. This
directive is required. Within a particular Job Type, there are also Levels
\label{Level}
\item [Level = \lt{}job-level\gt{}]
- \index[dir]{Level }
- The Level directive specifies the default Job level to be run. Each
-different
-Job Type (Backup, Restore, ...) has a different set of Levels that can be
-specified. The Level is normally overridden by a different value that is
-specified in the {\bf Schedule} resource. This directive is not required, but
-must be specified either by a {\bf Level} directive or as an override
-specified in the {\bf Schedule} resource.
+\index[dir]{Level}
+\index[dir]{Directive!Level}
+ The Level directive specifies the default Job level to be run. Each
+ different Job Type (Backup, Restore, ...) has a different set of Levels
+ that can be specified. The Level is normally overridden by a different
+ value that is specified in the {\bf Schedule} resource. This directive
+ is not required, but must be specified either by a {\bf Level} directive
+ or as an override specified in the {\bf Schedule} resource.
For a {\bf Backup} Job, the Level may be one of the following:
\begin{description}
\item [Full]
- \index[dir]{Full}
- is all files in the FileSet whether or not they have changed.
+\index[dir]{Full}
+ When the Level is set to Full, all files in the FileSet, whether or
+ not they have changed, will be backed up.
\item [Incremental]
\index[dir]{Incremental}
- is all files specified in the FileSet that have changed since the last successful backup of the
- the same Job using the same FileSet and Client.
- If the Director cannot find a previous valid Full backup then
- the job will be upgraded into a Full backup. When the Director looks for a
- valid backup record in the catalog database, it looks for a previous
- Job with:
+ When the Level is set to Incremental, all files specified in the FileSet
+ that have changed since the last successful backup of the same Job
+ using the same FileSet and Client will be backed up. If the Director
+ cannot find a previous valid Full backup then the job will be upgraded
+ into a Full backup. When the Director looks for a valid backup record
+ in the catalog database, it looks for a previous Job with:
\begin{itemize}
\item The same Job name.
different FileSet.
\item The Job was a Full, Differential, or Incremental backup.
\item The Job terminated normally (i.e. did not fail or was not canceled).
- \end{itemize}
+\end{itemize}
-If all the above conditions do not hold, the Director will upgrade the
-Incremental to a Full save. Otherwise, the Incremental backup will be
-performed as requested.
-
-The File daemon (Client) decides which files to backup for an Incremental
-backup by comparing start time of the prior Job (Full, Differential, or
-Incremental) against the time each file was last "modified" (st\_mtime) and
-the time its attributes were last "changed"(st\_ctime). If the file was
-modified or its attributes changed on or after this start time, it will then
-be backed up.
-
-Please note that some virus scanning software may change st\_ctime while
-doing the scan. For example, if the virus scanning program attempts to
-reset the access time (st\_atime), which Bacula does not use, it will cause
-st\_ctime to change and hence Bacula will backup the file during an
-Incremental or Differential backup. In the case of Sophos virus scanning, you
-can prevent it from resetting the access time (st\_atime) and hence changing
-st\_ctime by using the {\bf \verb:--:no-reset-atime} option. For other
-software,
-please see their manual.
-
-When Bacula does an Incremental backup, all modified files that are still on
-the system are backed up. However, any file that has been deleted since the
-last Full backup remains in the Bacula catalog, which means that if between
-a Full save and the time you do a restore, some files are deleted, those
-deleted files will also be restored. The deleted files will no longer appear
-in the catalog after doing another Full save. However, to remove deleted
-files from the catalog during an Incremental backup is quite a time consuming
-process and not currently implemented in Bacula.
-
-In addition, if you move a directory rather than copy it, the files in it do not
-have their modification time (st\_mtime) or their attribute change time
-(st\_ctime)
-changed. As a consequence, those files will probably not be backed up by an
-Incremental
-or Differential backup which depend solely on these time stamps. If you move a
-directory,
-and wish it to be properly backed up, it is generally preferable to copy it,
-then
-delete the original.
+ If all the above conditions do not hold, the Director will upgrade the
+ Incremental to a Full save. Otherwise, the Incremental backup will be
+ performed as requested.
+
+ The File daemon (Client) decides which files to backup for an
+ Incremental backup by comparing start time of the prior Job (Full,
+ Differential, or Incremental) against the time each file was last
+ "modified" (st\_mtime) and the time its attributes were last
+ "changed" (st\_ctime). If the file was modified or its attributes
+ changed on or after this start time, it will then be backed up.
+
+ Some virus scanning software may change st\_ctime while
+ doing the scan. For example, if the virus scanning program attempts to
+ reset the access time (st\_atime), which Bacula does not use, it will
+ cause st\_ctime to change and hence Bacula will backup the file during
+ an Incremental or Differential backup. In the case of Sophos virus
+ scanning, you can prevent it from resetting the access time (st\_atime)
+ and hence changing st\_ctime by using the {\bf \verb:--:no-reset-atime}
+ option. For other software, please see their manual.
+
+ When Bacula does an Incremental backup, all modified files that are
+ still on the system are backed up. However, any file that has been
+ deleted since the last Full backup remains in the Bacula catalog, which
+ means that if between a Full save and the time you do a restore, some
+ files are deleted, those deleted files will also be restored. The
+ deleted files will no longer appear in the catalog after doing another
+ Full save. However, to remove deleted files from the catalog during an
+ Incremental backup is quite a time-consuming process and not currently
+ implemented in Bacula.
+
+ In addition, if you move a directory rather than copy it, the files in
+ it do not have their modification time (st\_mtime) or their attribute
+ change time (st\_ctime) changed. As a consequence, those files will
+ probably not be backed up by an Incremental or Differential backup,
+ which depends solely on these time stamps. If you move a directory, and wish
+ it to be properly backed up, it is generally preferable to copy it, then
+ delete the original.
\item [Differential]
\index[dir]{Differential}
- is all files specified in the FileSet that have changed since the last
- successful Full backup of the same Job. If the Director cannot find a
+ When the Level is set to Differential,
+ all files specified in the FileSet that have changed since the last
+ successful Full backup of the same Job will be backed up.
+ If the Director cannot find a
valid previous Full backup for the same Job, FileSet, and Client,
   then the Differential job will be upgraded into a Full backup.
When the Director looks for a valid Full backup record in the catalog
\item The Job terminated normally (i.e. did not fail or was not canceled).
\end{itemize}
-If all the above conditions do not hold, the Director will upgrade the
-Differential to a Full save. Otherwise, the Differential backup will be
-performed as requested.
-
-The File daemon (Client) decides which files to backup for a differential
-backup by comparing the start time of the prior Full backup Job against the
-time each file was last "modified" (st\_mtime) and the time its attributes
-were last "changed" (st\_ctime). If the file was modified or its attributes
-were changed on or after this start time, it will then be backed up. The
-start time used is displayed after the {\bf Since} on the Job report. In rare
-cases, using the start time of the prior backup may cause some files to be
-backed up twice, but it ensures that no change is missed. As with the
-Incremental option, you should ensure that the clocks on your server and
-client are synchronized or as close as possible to avoid the possibility of a
-file being skipped. Note, on versions 1.33 or greater Bacula automatically
-makes the necessary adjustments to the time between the server and the client
-so that the times Bacula uses are synchronized.
-
-When Bacula does a Differential backup, all modified files that are still
-on the system are backed up. However, any file that has been deleted since
-the last Full backup remains in the Bacula catalog, which means that if
-between a Full save and the time you do a restore, some files are deleted,
-those deleted files will also be restored. The deleted files will no
-longer appear in the catalog after doing another Full save. However, to
-remove deleted files from the catalog during a Differential backup is quite
-a time consuming process and not currently implemented in Bacula. It is,
-however, a planned future feature.
-
-
-As noted above, if you move a directory rather than copy it, the
-files in it do not have their modification time (st\_mtime) or
-their attribute change time (st\_ctime) changed. As a
-consequence, those files will probably not be backed up by an
-Incremental or Differential backup which depend solely on these
-time stamps. If you move a directory, and wish it to be
-properly backed up, it is generally preferable to copy it, then
-delete the original. Alternatively, you can move the directory, then
-use the {\bf touch} program to update the timestamps.
-
-Every once and a while, someone asks why we need Differential
-backups as long as Incremental backups pickup all changed files.
-There are possibly many answers to this question, but the one
-that is the most important for me is that it effectively combines
-all the Incremental and Differential backups since the last Full
-backup into a single Differential backup. This has two effects:
-1. It gives some redundancy. 2. More importantly, it reduces the
-number of Volumes that are needed to do a restore effectively
-eliminating the need to read all the volumes on which the
-preceding Incremental and Differential backups since the last
-Full are done.
-
+ If all the above conditions do not hold, the Director will upgrade the
+ Differential to a Full save. Otherwise, the Differential backup will be
+ performed as requested.
+
+ The File daemon (Client) decides which files to backup for a
+ differential backup by comparing the start time of the prior Full backup
+ Job against the time each file was last "modified" (st\_mtime) and the
+ time its attributes were last "changed" (st\_ctime). If the file was
+ modified or its attributes were changed on or after this start time, it
+ will then be backed up. The start time used is displayed after the {\bf
+ Since} on the Job report. In rare cases, using the start time of the
+ prior backup may cause some files to be backed up twice, but it ensures
+ that no change is missed. As with the Incremental option, you should
+ ensure that the clocks on your server and client are synchronized or as
+ close as possible to avoid the possibility of a file being skipped.
+ Note, on versions 1.33 or greater Bacula automatically makes the
+ necessary adjustments to the time between the server and the client so
+ that the times Bacula uses are synchronized.
+
+ When Bacula does a Differential backup, all modified files that are
+ still on the system are backed up. However, any file that has been
+ deleted since the last Full backup remains in the Bacula catalog, which
+ means that if between a Full save and the time you do a restore, some
+ files are deleted, those deleted files will also be restored. The
+ deleted files will no longer appear in the catalog after doing another
+ Full save. However, to remove deleted files from the catalog during a
+ Differential backup is quite a time-consuming process and not currently
+ implemented in Bacula. It is, however, a planned future feature.
+
+ As noted above, if you move a directory rather than copy it, the
+ files in it do not have their modification time (st\_mtime) or
+ their attribute change time (st\_ctime) changed. As a
+ consequence, those files will probably not be backed up by an
+ Incremental or Differential backup, which depends solely on these
+ time stamps. If you move a directory, and wish it to be
+ properly backed up, it is generally preferable to copy it, then
+ delete the original. Alternatively, you can move the directory, then
+ use the {\bf touch} program to update the timestamps.
+
+ Every once in a while, someone asks why we need Differential
+ backups as long as Incremental backups pick up all changed files.
+ There are possibly many answers to this question, but the one
+ that is the most important for me is that a Differential backup
+ effectively merges
+ all the Incremental and Differential backups since the last Full backup
+ into a single Differential backup. This has two effects: 1. It gives
+ some redundancy since the old backups could be used if the merged backup
+ cannot be read. 2. More importantly, it reduces the number of Volumes
+ that are needed to do a restore, effectively eliminating the need to read
+ all the volumes on which the preceding Incremental and Differential
+ backups since the last Full were written.
\end{description}
\begin{description}
\item [InitCatalog]
- \index[dir]{InitCatalog}
+\index[dir]{InitCatalog}
does a scan of the specified {\bf FileSet} and stores the file
attributes in the Catalog database. Since no file data is saved, you
might ask why you would want to do this. It turns out to be a very
the files.
\item [Catalog]
- \index[dir]{Catalog}
+\index[dir]{Catalog}
Compares the current state of the files against the state previously
saved during an {\bf InitCatalog}. Any discrepancies are reported. The
items reported are determined by the {\bf verify} options specified on
track new files.
\item [VolumeToCatalog]
- \index[dir]{VolumeToCatalog}
- This level causes Bacula to read the file attribute data written to the
-Volume from the last Job. The file attribute data are compared to the values
-saved in the Catalog database and any differences are reported. This is
-similar to the {\bf Catalog} level except that instead of comparing the disk
-file attributes to the catalog database, the attribute data written to the
-Volume is read and compared to the catalog database. Although the attribute
-data including the signatures (MD5 or SHA1) are compared, the actual file data
-is not compared (it is not in the catalog).
-
-Please note! If you run two Verify VolumeToCatalog jobs on the same client at
-the same time, the results will certainly be incorrect. This is because the
-Verify VolumeToCatalog modifies the Catalog database while running.
+\index[dir]{VolumeToCatalog}
+ This level causes Bacula to read the file attribute data written to the
+ Volume from the last Job. The file attribute data are compared to the
+ values saved in the Catalog database and any differences are reported.
+ This is similar to the {\bf Catalog} level except that instead of
+ comparing the disk file attributes to the catalog database, the
+ attribute data written to the Volume is read and compared to the catalog
+ database. Although the attribute data including the signatures (MD5 or
+ SHA1) are compared, the actual file data is not compared (it is not in
+ the catalog).
+
+ Please note! If you run two Verify VolumeToCatalog jobs on the same
+ client at the same time, the results will certainly be incorrect. This
+ is because the Verify VolumeToCatalog modifies the Catalog database
+ while running.
\item [DiskToCatalog]
- \index[dir]{DiskToCatalog}
- This level causes Bacula to read the files as they currently are on disk,
-and
-to compare the current file attributes with the attributes saved in the
-catalog from the last backup for the job specified on the {\bf VerifyJob}
-directive. This level differs from the {\bf Catalog} level described above by
-the fact that it doesn't compare against a previous Verify job but against a
-previous backup. When you run this level, you must supply the verify options
-on your Include statements. Those options determine what attribute fields are
-compared.
-
-This command can be very useful if you have disk problems because it will
-compare the current state of your disk against the last successful backup,
-which may be several jobs.
-
-Note, the current implementation (1.32c) does not identify files that have
-been deleted.
+\index[dir]{DiskToCatalog}
+ This level causes Bacula to read the files as they currently are on
+ disk, and to compare the current file attributes with the attributes
+ saved in the catalog from the last backup for the job specified on the
+ {\bf VerifyJob} directive. This level differs from the {\bf Catalog}
+ level described above by the fact that it doesn't compare against a
+ previous Verify job but against a previous backup. When you run this
+ level, you must supply the verify options on your Include statements.
+ Those options determine what attribute fields are compared.
+
+ This command can be very useful if you have disk problems because it
+ will compare the current state of your disk against the last successful
+ backup, which may be several jobs.
+
+ Note, the current implementation (1.32c) does not identify files that
+ have been deleted.
\end{description}
\item [Verify Job = \lt{}Job-Resource-Name\gt{}]
- \index[dir]{Verify Job }
- If you run a verify job without this directive, the last job run will be
-compared with the catalog, which means that you must immediately follow a
-backup by a verify command. If you specify a {\bf Verify Job} Bacula will
-find the last job with that name that ran. This permits you to run all your
-backups, then run Verify jobs on those that you wish to be verified (most
-often a {\bf VolumeToCatalog}) so that the tape just written is re-read.
+ \index[dir]{Verify Job}
+ \index[dir]{Directive!Verify Job}
+ If you run a verify job without this directive, the last job run will be
+ compared with the catalog, which means that you must immediately follow
+ a backup by a verify command. If you specify a {\bf Verify Job}, Bacula
+ will find the last job with that name that ran. This permits you to run
+ all your backups, then run Verify jobs on those that you wish to be
+ verified (most often a {\bf VolumeToCatalog}) so that the tape just
+ written is re-read.
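+ For example, a sketch of a Verify job that re-reads what the backup
+ job {\bf NightlySave} last wrote (the job names are illustrative, and
+ other required Job directives are omitted):
+
+\footnotesize
+\begin{verbatim}
+Job {
+  Name = "VerifyNightly"
+  Type = Verify
+  Level = VolumeToCatalog
+  Verify Job = "NightlySave"
+  ...
+}
+\end{verbatim}
+\normalsize
+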
\item [JobDefs = \lt{}JobDefs-Resource-Name\gt{}]
- \index[dir]{JobDefs }
- If a JobDefs-Resource-Name is specified, all the values contained in the
-named JobDefs resource will be used as the defaults for the current Job. Any
-value that you explicitly define in the current Job resource, will override
-any defaults specified in the JobDefs resource. The use of this directive
-permits writing much more compact Job resources where the bulk of the
-directives are defined in one or more JobDefs. This is particularly useful if
-you have many similar Jobs but with minor variations such as different
-Clients. A simple example of the use of JobDefs is provided in the default
-bacula-dir.conf file.
+\index[dir]{JobDefs}
+\index[dir]{Directive!JobDefs}
+ If a JobDefs-Resource-Name is specified, all the values contained in the
+ named JobDefs resource will be used as the defaults for the current Job.
+ Any value that you explicitly define in the current Job resource will
+ override any defaults specified in the JobDefs resource. The use of
+ this directive permits writing much more compact Job resources where the
+ bulk of the directives are defined in one or more JobDefs. This is
+ particularly useful if you have many similar Jobs but with minor
+ variations such as different Clients. A simple example of the use of
+ JobDefs is provided in the default bacula-dir.conf file.
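+ For example, a compact sketch in the spirit of the default
+ bacula-dir.conf (all names are illustrative):
+
+\footnotesize
+\begin{verbatim}
+JobDefs {
+  Name = "DefaultJob"
+  Type = Backup
+  Level = Incremental
+  FileSet = "Full Set"
+  Messages = Standard
+  Pool = Default
+}
+
+Job {
+  Name = "Client1Backup"
+  Client = client1-fd
+  JobDefs = "DefaultJob"
+}
+\end{verbatim}
+\normalsize
+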
\item [Bootstrap = \lt{}bootstrap-file\gt{}]
- \index[dir]{Bootstrap }
- The Bootstrap directive specifies a bootstrap file that, if provided, will
-be used during {\bf Restore} Jobs and is ignored in other Job types. The {\bf
-bootstrap} file contains the list of tapes to be used in a restore Job as
-well as which files are to be restored. Specification of this directive is
-optional, and if specified, it is used only for a restore job. In addition,
-when running a Restore job from the console, this value can be changed.
-
-If you use the {\bf Restore} command in the Console program, to start a
-restore job, the {\bf bootstrap} file will be created automatically from the
-files you select to be restored.
-
-For additional details of the {\bf bootstrap} file, please see
-\ilink{Restoring Files with the Bootstrap File}{_ChapterStart43}
-chapter of this manual.
+\index[dir]{Bootstrap}
+\index[dir]{Directive!Bootstrap}
+ The Bootstrap directive specifies a bootstrap file that, if provided,
+ will be used during {\bf Restore} Jobs and is ignored in other Job
+ types. The {\bf bootstrap} file contains the list of tapes to be used
+ in a restore Job as well as which files are to be restored.
+ Specification of this directive is optional, and if specified, it is
+ used only for a restore job. In addition, when running a Restore job
+ from the console, this value can be changed.
+
+ If you use the {\bf Restore} command in the Console program to start a
+ restore job, the {\bf bootstrap} file will be created automatically from
+ the files you select to be restored.
+
+ For additional details of the {\bf bootstrap} file, please see
+ \ilink{Restoring Files with the Bootstrap File}{_ChapterStart43} chapter
+ of this manual.
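+ For example (the path is illustrative):
+
+\footnotesize
+\begin{verbatim}
+Bootstrap = /var/bacula/working/restore.bsr
+\end{verbatim}
+\normalsize
+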
\label{writebootstrap}
\item [Write Bootstrap = \lt{}bootstrap-file-specification\gt{}]
- \index[dir]{a name}
- The {\bf writebootstrap} directive specifies a file name where Bacula will
-write a {\bf bootstrap} file for each Backup job run. Thus this directive
-applies only to Backup Jobs. If the Backup job is a Full save, Bacula will
-erase any current contents of the specified file before writing the bootstrap
-records. If the Job is an Incremental save, Bacula will append the current
-bootstrap record to the end of the file.
-
-Using this feature, permits you to constantly have a bootstrap file that can
-recover the current state of your system. Normally, the file specified should
-be a mounted drive on another machine, so that if your hard disk is lost,
-you will immediately have a bootstrap record available. Alternatively, you
-should copy the bootstrap file to another machine after it is updated.
-
-If the {\bf bootstrap-file-specification} begins with a vertical bar (|),
-Bacula will use the specification as the name of a program to which it will
-pipe the bootstrap record. It could for example be a shell script that emails
-you the bootstrap record.
-
-For more details on using this file, please see the chapter entitled
-\ilink{The Bootstrap File}{_ChapterStart43} of this manual.
+\index[dir]{Write Bootstrap}
+\index[dir]{Directive!Write Bootstrap}
+ The {\bf writebootstrap} directive specifies a file name where Bacula
+ will write a {\bf bootstrap} file for each Backup job run. Thus this
+ directive applies only to Backup Jobs. If the Backup job is a Full
+ save, Bacula will erase any current contents of the specified file
+ before writing the bootstrap records. If the Job is an Incremental
+ save, Bacula will append the current bootstrap record to the end of the
+ file.
+
+ Using this feature permits you to constantly have a bootstrap file that
+ can recover the current state of your system. Normally, the file
+ specified should be a mounted drive on another machine, so that if your
+ hard disk is lost, you will immediately have a bootstrap record
+ available. Alternatively, you should copy the bootstrap file to another
+ machine after it is updated.
+
+ If the {\bf bootstrap-file-specification} begins with a vertical bar
+ (|), Bacula will use the specification as the name of a program to which
+ it will pipe the bootstrap record. It could for example be a shell
+ script that emails you the bootstrap record.
+
+ For more details on using this file, please see the chapter entitled
+ \ilink{The Bootstrap File}{_ChapterStart43} of this manual.
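+ For example, both forms sketched (the path and script name are
+ illustrative):
+
+\footnotesize
+\begin{verbatim}
+# Write bootstrap records to a file, ideally on another machine
+Write Bootstrap = "/mnt/backup1/client1.bsr"
+
+# Or pipe each bootstrap record to a program, e.g. one that emails it
+Write Bootstrap = "|/usr/local/bin/mail-bsr.sh"
+\end{verbatim}
+\normalsize
+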
\item [Client = \lt{}client-resource-name\gt{}]
- \index[dir]{Client }
+\index[dir]{Client}
+\index[dir]{Directive!Client}
The Client directive specifies the Client (File daemon) that will be used in
the current Job. Only a single Client may be specified in any one Job. The
Client runs on the machine to be backed up, and sends the requested files to
This directive is required.
\item [FileSet = \lt{}FileSet-resource-name\gt{}]
- \index[dir]{FileSet }
- The FileSet directive specifies the FileSet that will be used in the
-current
- Job. The FileSet specifies which directories (or files) are to be backed up,
- and what options to use (e.g. compression, ...). Only a single FileSet
- resource may be specified in any one Job. For additional details, see the
- \ilink{FileSet Resource section}{FileSetResource} of this
- chapter. This directive is required.
+\index[dir]{FileSet}
+\index[dir]{Directive!FileSet}
+ The FileSet directive specifies the FileSet that will be used in the
+ current Job. The FileSet specifies which directories (or files) are to
+ be backed up, and what options to use (e.g. compression, ...). Only a
+ single FileSet resource may be specified in any one Job. For additional
+ details, see the \ilink{FileSet Resource section}{FileSetResource} of
+ this chapter. This directive is required.
\item [Messages = \lt{}messages-resource-name\gt{}]
- \index[dir]{Messages }
- The Messages directive defines what Messages resource should be used for
-this
- job, and thus how and where the various messages are to be delivered. For
- example, you can direct some messages to a log file, and others can be sent
- by email. For additional details, see the
- \ilink{Messages Resource}{_ChapterStart15} Chapter of this
- manual. This directive is required.
+\index[dir]{Messages}
+\index[dir]{Directive!Messages}
+ The Messages directive defines what Messages resource should be used for
+ this job, and thus how and where the various messages are to be
+ delivered. For example, you can direct some messages to a log file, and
+ others can be sent by email. For additional details, see the
+ \ilink{Messages Resource}{_ChapterStart15} Chapter of this manual. This
+ directive is required.
\item [Pool = \lt{}pool-resource-name\gt{}]
- \index[dir]{Pool }
- The Pool directive defines the pool of Volumes where your data can be backed
- up. Many Bacula installations will use only the {\bf Default} pool. However,
- if you want to specify a different set of Volumes for different Clients or
- different Jobs, you will probably want to use Pools. For additional details,
- see the
- \ilink{Pool Resource section}{PoolResource} of this chapter. This
- directive is required.
+\index[dir]{Pool}
+\index[dir]{Directive!Pool}
+ The Pool directive defines the pool of Volumes where your data can be
+ backed up. Many Bacula installations will use only the {\bf Default}
+ pool. However, if you want to specify a different set of Volumes for
+ different Clients or different Jobs, you will probably want to use
+ Pools. For additional details, see the \ilink{Pool Resource
+ section}{PoolResource} of this chapter. This directive is required.
\item [Full Backup Pool = \lt{}pool-resource-name\gt{}]
- \index[dir]{Full Backup Pool }
- The {\it Full Backup Pool} specifies a Pool to be used for Full backups. It
- will override any Pool specification during a Full backup. This directive is
- optional.
+\index[dir]{Full Backup Pool}
+\index[dir]{Directive!Full Backup Pool}
+ The {\it Full Backup Pool} specifies a Pool to be used for Full backups.
+ It will override any Pool specification during a Full backup. This
+ directive is optional.
\item [Differential Backup Pool = \lt{}pool-resource-name\gt{}]
- \index[dir]{Differential Backup Pool }
- The {\it Differential Backup Pool} specifies a Pool to be used for
- Differential backups. It will override any Pool specification during a
- Differential backup. This directive is optional.
+\index[dir]{Differential Backup Pool}
+\index[dir]{Directive!Differential Backup Pool}
+ The {\it Differential Backup Pool} specifies a Pool to be used for
+ Differential backups. It will override any Pool specification during a
+ Differential backup. This directive is optional.
\item [Incremental Backup Pool = \lt{}pool-resource-name\gt{}]
- \index[dir]{Incremental Backup Pool }
- The {\it Incremental Backup Pool} specifies a Pool to be used for
-Incremental
- backups. It will override any Pool specification during an Incremental
-backup.
- This directive is optional.
+\index[dir]{Incremental Backup Pool}
+\index[dir]{Directive!Incremental Backup Pool}
+ The {\it Incremental Backup Pool} specifies a Pool to be used for
+ Incremental backups. It will override any Pool specification during an
+ Incremental backup. This directive is optional.
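+
+ A sketch combining the three level-specific Pool overrides in a Job
+ resource (the resource names are illustrative, and other required Job
+ directives are omitted):
+\begin{verbatim}
+Job {
+  Name = "NightlySave"
+  Pool = Default
+  Full Backup Pool = FullPool
+  Differential Backup Pool = DiffPool
+  Incremental Backup Pool = IncPool
+}
+\end{verbatim}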
\item [Schedule = \lt{}schedule-name\gt{}]
- \index[dir]{Schedule }
+\index[dir]{Schedule}
+\index[dir]{Directive!Schedule}
The Schedule directive defines what schedule is to be used for the Job.
The schedule in turn determines when the Job will be automatically
started and what Job level (i.e. Full, Incremental, ...) is to be run.
\item [Storage = \lt{}storage-resource-name\gt{}]
- \index[dir]{Storage }
- The Storage directive defines the name of the storage services where you
-want
- to backup the FileSet data. For additional details, see the
+\index[dir]{Storage}
+\index[dir]{Directive!Storage}
+ The Storage directive defines the name of the storage services where you
+ want to backup the FileSet data. For additional details, see the
\ilink{Storage Resource Chapter}{StorageResource2} of this manual.
- This directive is required.
+ This directive is required.
\item [Max Start Delay = \lt{}time\gt{}]
- \index[dir]{Max Start Delay }
+\index[dir]{Max Start Delay}
+\index[dir]{Directive!Max Start Delay}
The time specifies the maximum delay between the scheduled time and the
actual start time for the Job. For example, a job can be scheduled to
run at 1:00am, but because other jobs are running, it may wait to run.
which indicates no limit.
\item [Max Run Time = \lt{}time\gt{}]
- \index[dir]{Max Run Time }
+\index[dir]{Max Run Time}
+\index[dir]{Directive!Max Run Time}
The time specifies the maximum allowed time that a job may run, counted
from when the job starts, ({\bf not} necessarily the same as when the
job was scheduled). This directive is implemented in version 1.33 and
later.
\item [Max Wait Time = \lt{}time\gt{}]
- \index[dir]{Max Wait Time }
+\index[dir]{Max Wait Time}
+\index[dir]{Directive!Max Wait Time}
The time specifies the maximum allowed time that a job may block waiting
for a resource (such as waiting for a tape to be mounted, or waiting for
the storage or file daemons to perform their duties), counted from the
scheduled). This directive is implemented only in version 1.33 and
later.
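+
+ For example (the values are illustrative), using the standard time
+ specification syntax:
+\begin{verbatim}
+Max Run Time = 8 hours
+Max Wait Time = 30 minutes
+\end{verbatim}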
-
-
\item [Incremental Max Wait Time = \lt{}time\gt{}]
- \index[dir]{Incremental Max Wait Time }
+\index[dir]{Incremental Max Wait Time}
+\index[dir]{Directive!Incremental Max Wait Time}
The time specifies the maximum allowed time that an Incremental backup
job may block waiting for a resource (such as waiting for a tape to be
mounted, or waiting for the storage or file daemons to perform their
{\bf Max Wait Time} it may also be applied to the job.
\item [Differential Max Wait Time = \lt{}time\gt{}]
- \index[dir]{Differential Max Wait Time }
+\index[dir]{Differential Max Wait Time}
+\index[dir]{Directive!Differential Max Wait Time}
The time specifies the maximum allowed time that a Differential backup
job may block waiting for a resource (such as waiting for a tape to be
mounted, or waiting for the storage or file daemons to perform their
{\bf Max Wait Time} it may also be applied to the job.
\item [Prefer Mounted Volumes = \lt{}yes|no\gt{}]
- \index[dir]{Prefer Mounted Volumes}
+\index[dir]{Prefer Mounted Volumes}
+\index[dir]{Directive!Prefer Mounted Volumes}
If the Prefer Mounted Volumes directive is set to {\bf yes} (default
yes), the Storage daemon is requested to select either an Autochanger or
a drive with a valid Volume already mounted in preference to a drive
\item [Prune Jobs = \lt{}yes|no\gt{}]
- \index[dir]{Prune Jobs }
+\index[dir]{Prune Jobs}
+\index[dir]{Directive!Prune Jobs}
Normally, pruning of Jobs from the Catalog is specified on a Client by
Client basis in the Client resource with the {\bf AutoPrune} directive.
If this directive is specified (not normally) and the value is {\bf
\item [Prune Files = \lt{}yes|no\gt{}]
- \index[dir]{Prune Files }
+\index[dir]{Prune Files}
+\index[dir]{Directive!Prune Files}
Normally, pruning of Files from the Catalog is specified on a Client by
Client basis in the Client resource with the {\bf AutoPrune} directive.
If this directive is specified (not normally) and the value is {\bf
default is {\bf no}.
\item [Prune Volumes = \lt{}yes|no\gt{}]
- \index[dir]{Prune Volumes }
+\index[dir]{Prune Volumes}
+\index[dir]{Directive!Prune Volumes}
Normally, pruning of Volumes from the Catalog is specified on a Client
by Client basis in the Client resource with the {\bf AutoPrune}
directive. If this directive is specified (not normally) and the value
is {\bf yes}, it will override the value specified in the Client
resource. The default is {\bf no}.
-\item [Run Before Job = \lt{}command\gt{}]
- \index[dir]{Run Before Job }
- The specified {\bf command} is run as an external program prior to
- running the current Job. Any output sent by the command to standard output
- will be included in the Bacula job report. The command string must be a
- valid program name or name of a shell script. This directive is not
- required, but if it is defined, and if the exit code of the program run
- is non-zero, the current Bacula job will be canceled. In addition, the
- command string is parsed then fed to the execvp() function, which means
- that the path will be searched to execute your specified command, but
- there is no shell interpretation, as a consequence, if you invoke
- complicated commands or want any shell features such as redirection or
- piping, you must call a shell script and do it inside that script.
+\item [RunScript \{...\}]
+ \index[dir]{RunScript}
+ \index[dir]{Directive!Run Script}
+
+ The specified {\bf command} is run as an external program before or after the
+ current Job. This directive is optional.
+
+ You can use the following options:
+\begin{tabular}{|c|c|c|l}
+Options & Value & Default & Information \\
+\hline
+\hline
+Runs On Success & Yes/No & {\it Yes} & Run command if JobStatus is successful\\
+\hline
+Runs On Failure & Yes/No & {\it No} & Run command if JobStatus is not successful\\
+\hline
+Runs On Client & Yes/No & {\it Yes} & Run command on client\\
+\hline
+Runs When & Before|After|Always & {\it Never} & When to run the command\\
+\hline
+Abort Job On Error & Yes/No & {\it Yes} & Abort the job if the script
+ returns a non-zero value \\
+\hline
+Command & & & Path to your script\\
+\hline
+\end{tabular}
+
+ Any output sent by the command to standard output will be included in the
+ Bacula job report. The command string must be a valid program name or name
+ of a shell script.
+
+ In addition, the command string is parsed then fed to the execvp() function,
+ which means that the path will be searched to execute your specified
+ command, but there is no shell interpretation, as a consequence, if you
+ invoke complicated commands or want any shell features such as redirection
+ or piping, you must call a shell script and do it inside that script.
Before submitting the specified command to the operating system, Bacula
performs character substitution of the following characters:
%% = %
%c = Client's name
%d = Director's name
- %i = JobId
%e = Job Exit Status
- %j = Unique Job name
+ %i = JobId
+ %j = Unique Job name
%l = Job Level
%n = Job name
- %t = Job type
+ %s = Since time
+ %t = Job type (Backup, ...)
%v = Volume name
\end{verbatim}
Thus if you edit it on a command line, you will need to enclose
it within some sort of quotes.
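+
+ For example, a hedged sketch (the script path is illustrative) that
+ passes several substitution characters as arguments:
+\begin{verbatim}
+RunScript {
+  RunsWhen = After
+  Command = "/usr/local/bin/jobreport.sh %j %l %e"
+}
+\end{verbatim}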
-
- Bacula checks the exit status of the RunBeforeJob program. If it is
- non-zero, the job will be error terminated. Lutz Kittler has pointed
- out that using the RunBeforJob directive can be a simple way to modify
- your schedules during a holiday. For example, suppose that you normally
- do Full backups on Fridays, but Thursday and Friday are holidays. To
- avoid having to change tapes between Thursday and Friday when no one is
- in the office, you can create a RunBeforeJob that returns a non-zero
- status on Thursday and zero on all other days. That way, the Thursday
- job will not run, and on Friday the tape you inserted on Wednesday
- before leaving will be used.
-\item [Run After Job = \lt{}command\gt{}]
- \index[dir]{Run After Job }
- The specified {\bf command} is run as an external program after the
- current job terminates. This directive is not required. The command
- string must be a valid program name or name of a shell script. If the
- exit code of the program run is non-zero, the current Bacula job will
- terminate in error. Before submitting the specified command to the
- operating system, Bacula performs character substitution as described
- above for the {\bf Run Before Job} directive.
-
- An example of the use of this directive is given in the
- \ilink{Tips Chapter}{JobNotification} of this manual. As of version
- 1.30, Bacula checks the exit status of the RunAfter program. If it is
- non-zero, the job will be terminated in error.
-\item [Client Run Before Job = \lt{}command\gt{}]
- \index[dir]{Client Run Before Job }
- This directive is the same as {\bf Run Before Job} except that the program is run on
- the client machine. The same restrictions apply to Unix systems as noted
- above for the {\bf Run Before Job}. In addition, for a Windows client on
- version 1.33 and above, please take careful note that you must ensure a
- correct path to your script. The script or program can be a .com, .exe or
- a .bat file. However, if you specify a path, you must also specify the full
- extension. Unix like commands will not work unless you have installed and
- properly configured Cygwin in addition to and separately from Bacula.
-
+You can use the following shortcuts:
+\begin{tabular}{|c|c|c|c|c|c}
+Keyword & RunsOnSuccess & RunsOnFailure & AbortJobOnError & Runs On Client & RunsWhen \\
+\hline
+Run Before Job & & & Yes & No & Before \\
+\hline
+Run After Job & Yes & No & & No & After \\
+\hline
+Run After Failed Job & No & Yes & & No & After \\
+\hline
+Client Run Before Job & & & Yes & Yes & Before \\
+\hline
+Client Run After Job & Yes & No & & Yes & After \\
+\hline
+Client Run After Failed Job & No & Yes & & Yes & After \\
+\end{tabular}
+
+Example:
+\begin{verbatim}
+RunScript {
+ RunsWhen = Before
+ AbortJobOnError = No
+ Command = "/etc/init.d/apache stop"
+}
+
+RunScript {
+ RunsWhen = After
+ RunsOnFailure = yes
+ Command = "/etc/init.d/apache start"
+}
+\end{verbatim}
+
{\bf Special Windows Considerations}
- The command can be anything that cmd.exe or command.com will recognize as an
- executable file. Specifying the executable's extension is optional, unless
- there is an ambiguity. (i.e. ls.bat, ls.exe)
+
+ In addition, for a Windows client on version 1.33 and above, please take
+ careful note that you must ensure a correct path to your script. The
+ script or program can be a .com, .exe or a .bat file. However, if you
+ specify a path, you must also specify the full extension. Unix like
+ commands will not work unless you have installed and properly configured
+ Cygwin in addition to and separately from Bacula.
+
+ The command can be anything that cmd.exe or command.com will recognize
+ as an executable file. Specifying the executable's extension is
+ optional, unless there is an ambiguity. (e.g. ls.bat, ls.exe)
- The System \%Path\% will be searched for the command. (under the environment
- variable dialog you have have both System Environment and User Environment,
- we believe that only the System environment will be available to bacula-fd,
- if it is running as a service.)
+ The System \%Path\% will be searched for the command. (Under the
+ environment variable dialog you have both System Environment and
+ User Environment; we believe that only the System environment will be
+ available to bacula-fd if it is running as a service.)
System environment variables can be referenced with \%var\% and
used as either part of the command name or arguments.
- When specifying a full path to an executable if the path or executable name
- contains whitespace or special characters they will need to be quoted.
- Arguments containing whitespace or special characters will also have to be
- quoted.
\footnotesize
\begin{verbatim}
\end{verbatim}
\normalsize
- The special characters \&()[]\{\}\^{}=;!'+,`\~{} will need to be quoted if
- they are part of a filename or argument.
+ The special characters \&()[]\{\}\^{}=;!'+,`\~{} will need to be quoted
+ if they are part of a filename or argument.
- If someone is logged in, a blank "command" window running the commands
-will
- be present during the execution of the command.
+ If someone is logged in, a blank "command" window running the commands
+ will be present during the execution of the command.
- Some Suggestions from Phil Stracchino for running on Win32 machines with the
- native Win32 File daemon:
+ Some Suggestions from Phil Stracchino for running on Win32 machines with
+ the native Win32 File daemon:
\begin{enumerate}
- \item You might want the ClientRunBeforeJob directive to specify a .bat file
- which runs the actual client-side commands, rather than trying to run
-(for
- example) regedit /e directly.
+ \item You might want the ClientRunBeforeJob directive to specify a .bat
+ file which runs the actual client-side commands, rather than trying
+ to run (for example) regedit /e directly.
\item The batch file should explicitly 'exit 0' on successful completion.
\item The path to the batch file should be specified in Unix form:
'%l'\""
\end{verbatim}
\normalsize
- When the job is run, you will get messages from the output of the script
-stating
- that the backup has started. Even though the command being run is
- backgrounded with \&, the job will block until the "db2 BACKUP DATABASE"
-command,
- thus the backup stalls.
+
+When the job is run, you will get messages from the output of the script
+stating that the backup has started. Even though the command being run is
+backgrounded with \&, the job will block until the "db2 BACKUP DATABASE"
+command completes, and thus the backup stalls.
- To remedy this situation, the "db2 BACKUP DATABASE" line should be changed to
+To remedy this situation, the "db2 BACKUP DATABASE" line should be changed to
the following:
\footnotesize
It is important to redirect the input and outputs of a backgrounded command to
/dev/null to prevent the script from blocking.
+\item [Run Before Job = \lt{}command\gt{}]
+\index[dir]{Run Before Job}
+\index[dir]{Directive!Run Before Job}
+The specified {\bf command} is run as an external program prior to running the
+current Job. This directive is not required, but if it is defined, and if the
+exit code of the program run is non-zero, the current Bacula job will be
+canceled.
+
+\begin{verbatim}
+Run Before Job = "echo test"
+\end{verbatim}
+ is equivalent to:
+\begin{verbatim}
+RunScript {
+ Command = "echo test"
+ RunsOnClient = No
+ RunsWhen = Before
+}
+\end{verbatim}
+
+ Lutz Kittler has pointed out that using the RunBeforeJob directive can be a
+ simple way to modify your schedules during a holiday. For example, suppose
+ that you normally do Full backups on Fridays, but Thursday and Friday are
+ holidays. To avoid having to change tapes between Thursday and Friday when
+ no one is in the office, you can create a RunBeforeJob that returns a
+ non-zero status on Thursday and zero on all other days. That way, the
+ Thursday job will not run, and on Friday the tape you inserted on Wednesday
+ before leaving will be used.
+
+\item [Run After Job = \lt{}command\gt{}]
+\index[dir]{Run After Job}
+\index[dir]{Directive!Run After Job}
+ The specified {\bf command} is run as an external program if the current
+ job terminates normally (without error and without being canceled). This
+ directive is not required. If the exit code of the program run is
+ non-zero, Bacula will print a warning message. Before submitting the
+ specified command to the operating system, Bacula performs character
+ substitution as described above for the {\bf RunScript} directive.
+
+ An example of the use of this directive is given in the
+ \ilink{Tips Chapter}{JobNotification} of this manual.
+
+ See the {\bf Run After Failed Job} directive if you want to run a
+ script after the job has terminated with any non-normal status.
+
+\item [Run After Failed Job = \lt{}command\gt{}]
+\index[dir]{Run After Failed Job}
+\index[dir]{Directive!Run After Failed Job}
+ The specified {\bf command} is run as an external program after the current
+ job terminates with any error status. This directive is not required. The
+ command string must be a valid program name or name of a shell script. If
+ the exit code of the program run is non-zero, Bacula will print a
+ warning message. Before submitting the specified command to the
+ operating system, Bacula performs character substitution as described above
+ for the {\bf RunScript} directive. Note: if you want your script to
+ run regardless of the exit status of the Job, you can use the following:
+\begin{verbatim}
+RunScript {
+ Command = "echo test"
+ RunsWhen = After
+ RunsOnFailure = yes
+ RunsOnClient = no
+ RunsOnSuccess = yes # default, you can drop this line
+}
+\end{verbatim}
+
+ An example of the use of this directive is given in the
+ \ilink{Tips Chapter}{JobNotification} of this manual.
+
+
+\item [Client Run Before Job = \lt{}command\gt{}]
+\index[dir]{Client Run Before Job}
+\index[dir]{Directive!Client Run Before Job}
+ This directive is the same as {\bf Run Before Job} except that the
+ program is run on the client machine. The same restrictions apply to
+ Unix systems as noted above for the {\bf RunScript}.
\item [Client Run After Job = \lt{}command\gt{}]
- \index[dir]{Client Run After Job }
- This directive is the same as {\bf Run After Job} except that it is run on
-the
- client machine. Note, please see the notes above in {\bf Client Run Before
- Job} concerning Windows clients.
+ \index[dir]{Client Run After Job}
+ \index[dir]{Directive!Client Run After Job}
+ This directive is the same as {\bf Run After Job} except that it is run on
+ the client machine. Note, please see the notes above in {\bf RunScript}
+ concerning Windows clients.
\item [Rerun Failed Levels = \lt{}yes|no\gt{}]
- \index[dir]{Rerun Failed Levels }
- If this directive is set to {\bf yes} (default no), and Bacula detects that
-a
- previous job at a higher level (i.e. Full or Differential) has failed, the
- current job level will be upgraded to the higher level. This is particularly
- useful for Laptops where they may often be unreachable, and if a prior Full
- save has failed, you wish the very next backup to be a Full save rather
-than
- whatever level it is started as.
+ \index[dir]{Rerun Failed Levels}
+ \index[dir]{Directive!Rerun Failed Levels}
+ If this directive is set to {\bf yes} (default no), and Bacula detects that
+ a previous job at a higher level (i.e. Full or Differential) has failed,
+ the current job level will be upgraded to the higher level. This is
+ particularly useful for Laptops where they may often be unreachable, and if
+ a prior Full save has failed, you wish the very next backup to be a Full
+ save rather than whatever level it is started as.
\item [Spool Data = \lt{}yes|no\gt{}]
- \index[dir]{Spool Data }
+ \index[dir]{Spool Data}
+ \index[dir]{Directive!Spool Data}
If this directive is set to {\bf yes} (default no), the Storage daemon will
-be requested to spool the data for this Job to disk rather than write it
-directly to tape. Once all the data arrives or the spool files' maximum sizes
-are reached, the data will be despooled and written to tape. When this
-directive is set to yes, the Spool Attributes is also automatically set to
-yes. Spooling data prevents tape shoe-shine (start and stop) during
-Incremental saves. This option should not be used if you are writing to a
-disk file.
+ be requested to spool the data for this Job to disk rather than write it
+ directly to tape. Once all the data arrives or the spool files' maximum sizes
+ are reached, the data will be despooled and written to tape. When this
+ directive is set to yes, the Spool Attributes is also automatically set to
+ yes. Spooling data prevents tape shoe-shine (start and stop) during
+ Incremental saves. This option should not be used if you are writing to a
+ disk file.
\item [Spool Attributes = \lt{}yes|no\gt{}]
- \index[dir]{Spool Attributes }
+ \index[dir]{Spool Attributes}
+ \index[dir]{Directive!Spool Attributes}
+ \index[dir]{slow}
+ \index[general]{slow}
+ \index[dir]{Backups!slow}
+ \index[general]{Backups!slow}
The default is set to {\bf no}, which means that the File attributes are
sent
by the Storage daemon to the Director as they are stored on tape. However,
will be sent to the Director.
\item [Where = \lt{}directory\gt{}]
- \index[dir]{Where }
- This directive applies only to a Restore job and specifies a prefix to the
-directory name of all files being restored. This permits files to be restored
-in a different location from which they were saved. If {\bf Where} is not
-specified or is set to backslash ({\bf /}), the files will be restored to
-their original location. By default, we have set {\bf Where} in the example
-configuration files to be {\bf /tmp/bacula-restores}. This is to prevent
-accidental overwriting of your files.
+ \index[dir]{Where}
+ \index[dir]{Directive!Where}
+ This directive applies only to a Restore job and specifies a prefix to
+ the directory name of all files being restored. This permits files to
+ be restored in a different location from which they were saved. If {\bf
+ Where} is not specified or is set to forward slash ({\bf /}), the files will
+ be restored to their original location. By default, we have set {\bf
+ Where} in the example configuration files to be {\bf
+ /tmp/bacula-restores}. This is to prevent accidental overwriting of
+ your files.
\item [Replace = \lt{}replace-option\gt{}]
- \index[dir]{Replace }
- This directive applies only to a Restore job and specifies what happens when
-Bacula wants to restore a file or directory that already exists. You have the
- following options for {\bf replace-option}:
+ \index[dir]{Replace}
+ \index[dir]{Directive!Replace}
+ This directive applies only to a Restore job and specifies what happens
+ when Bacula wants to restore a file or directory that already exists.
+ You have the following options for {\bf replace-option}:
\begin{description}
\item [always]
\index[dir]{always}
- when the file to be restored already exists, it is deleted and then replaced
-by
- the copy that was backed up.
+ when the file to be restored already exists, it is deleted and then
+ replaced by the copy that was backed up.
\item [ifnewer]
- \index[dir]{ifnewer}
- if the backed up file (on tape) is newer than the existing file, the existing
- file is deleted and replaced by the back up.
+\index[dir]{ifnewer}
+ if the backed up file (on tape) is newer than the existing file, the
+ existing file is deleted and replaced by the back up.
\item [ifolder]
\index[dir]{ifolder}
- if the backed up file (on tape) is older than the existing file, the existing
- file is deleted and replaced by the back up.
+ if the backed up file (on tape) is older than the existing file, the
+ existing file is deleted and replaced by the back up.
\item [never]
\index[dir]{never}
\item [Prefix Links=\lt{}yes|no\gt{}]
\index[dir]{Prefix Links}
+ \index[dir]{Directive!Prefix Links}
If a {\bf Where} path prefix is specified for a recovery job, apply it
to absolute links as well. The default is {\bf No}. When set to {\bf
Yes} then while restoring files to an alternate directory, any absolute
original locations, all files linked with absolute names will be broken.
\item [Maximum Concurrent Jobs = \lt{}number\gt{}]
- \index[dir]{Maximum Concurrent Jobs }
+ \index[dir]{Maximum Concurrent Jobs}
+ \index[dir]{Directive!Maximum Concurrent Jobs}
where \lt{}number\gt{} is the maximum number of Jobs from the current
Job resource that can run concurrently. Note, this directive limits
only Jobs with the same name as the resource in which it appears. Any
Director's resource.
\item [Reschedule On Error = \lt{}yes|no\gt{}]
- \index[dir]{Reschedule On Error }
+ \index[dir]{Reschedule On Error}
+ \index[dir]{Directive!Reschedule On Error}
If this directive is enabled, and the job terminates in error, the job
will be rescheduled as determined by the {\bf Reschedule Interval} and
{\bf Reschedule Times} directives. If you cancel the job, it will not
machines that are not always connected to the network or switched on.
\item [Reschedule Interval = \lt{}time-specification\gt{}]
- \index[dir]{Reschedule Interval }
+ \index[dir]{Reschedule Interval}
+ \index[dir]{Directive!Reschedule Interval}
If you have specified {\bf Reschedule On Error = yes} and the job
terminates in error, it will be rescheduled after the interval of time
specified by {\bf time-specification}. See \ilink{the time
rescheduled on error.
\item [Reschedule Times = \lt{}count\gt{}]
- \index[dir]{Reschedule Times }
+ \index[dir]{Reschedule Times}
+ \index[dir]{Directive!Reschedule Times}
This directive specifies the maximum number of times to reschedule the
job. If it is set to zero (the default) the job will be rescheduled an
indefinite number of times.
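+
+ For example, a sketch of the three rescheduling directives working
+ together (the values are illustrative):
+\begin{verbatim}
+Reschedule On Error = yes
+Reschedule Interval = 30 minutes
+Reschedule Times = 5
+\end{verbatim}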
\item [Run = \lt{}job-name\gt{}]
- \index[dir]{Run directive}
+ \index[dir]{Run}
+ \index[dir]{Directive!Run}
\index[dir]{Clone a Job}
The Run directive (not to be confused with the Run option in a
Schedule) allows you to start other jobs or to clone jobs. By using the
\label{Priority}
\item [Priority = \lt{}number\gt{}]
- \index[dir]{Priority }
+ \index[dir]{Priority}
+ \index[dir]{Directive!Priority}
This directive permits you to control the order in which your jobs run
by specifying a positive non-zero number. The higher the number, the
lower the job priority. Assuming you are not running concurrent jobs,
The default priority is 10.
- If you want to run concurrent jobs, which is not recommended, you should
-keep
- these points in mind:
+ If you want to run concurrent jobs, which is not recommended, you should
+ keep these points in mind:
\begin{itemize}
-\item To run concurrent jobs, you must set Maximum Concurrent Jobs = 2 in 5
- or 6 distinct places: in bacula-dir.conf in the Director, the Job, the
- Client, the Storage resources; in bacula-fd in the FileDaemon (or Client)
- resource, and in bacula-sd.conf in the Storage resource. If any one is
- missing, it will throttle the jobs to one at a time.
-\item Bacula concurrently runs jobs of only one priority at a time. It will
- not simultaneously run a priority 1 and a priority 2 job.
-\item If Bacula is running a priority 2 job and a new priority 1 job is
- scheduled, it will wait until the running priority 2 job terminates even if
- the Maximum Concurrent Jobs settings would otherwise allow two jobs to run
- simultaneously.
-\item Suppose that bacula is running a priority 2 job and a new priority 1 job
- is scheduled and queued waiting for the running priority 2 job to terminate.
- If you then start a second priority 2 job, the waiting priority 1 job will
- prevent the new priority 2 job from running concurrently with the running
- priority 2 job. That is: as long as there is a higher priority job waiting
- to
- run, no new lower priority jobs will start even if the Maximum Concurrent
- Jobs settings would normally allow them to run. This ensures that higher
- priority jobs will be run as soon as possible.
+\item To run concurrent jobs, you must set Maximum Concurrent Jobs = 2 in 5
+ or 6 distinct places: in bacula-dir.conf in the Director, the Job, the
+ Client, the Storage resources; in bacula-fd in the FileDaemon (or
+ Client) resource, and in bacula-sd.conf in the Storage resource. If any
+ one is missing, it will throttle the jobs to one at a time.
+\item Bacula concurrently runs jobs of only one priority at a time. It
+ will not simultaneously run a priority 1 and a priority 2 job.
+\item If Bacula is running a priority 2 job and a new priority 1 job is
+ scheduled, it will wait until the running priority 2 job terminates even
+ if the Maximum Concurrent Jobs settings would otherwise allow two jobs
+ to run simultaneously.
+\item Suppose that bacula is running a priority 2 job and a new priority 1
+ job is scheduled and queued waiting for the running priority 2 job to
+ terminate. If you then start a second priority 2 job, the waiting
+ priority 1 job will prevent the new priority 2 job from running
+ concurrently with the running priority 2 job. That is: as long as there
+ is a higher priority job waiting to run, no new lower priority jobs will
+ start even if the Maximum Concurrent Jobs settings would normally allow
+ them to run. This ensures that higher priority jobs will be run as soon
+ as possible.
\end{itemize}
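+As a hedged sketch (the resource fragments below are illustrative, not a
+complete configuration), allowing two concurrent jobs means repeating the
+setting in each daemon's configuration file:
+
+\footnotesize
+\begin{verbatim}
+# bacula-dir.conf
+Director { ... Maximum Concurrent Jobs = 2 }
+Job      { ... Maximum Concurrent Jobs = 2 }
+Client   { ... Maximum Concurrent Jobs = 2 }
+Storage  { ... Maximum Concurrent Jobs = 2 }
+
+# bacula-fd.conf
+FileDaemon { ... Maximum Concurrent Jobs = 2 }
+
+# bacula-sd.conf
+Storage { ... Maximum Concurrent Jobs = 2 }
+\end{verbatim}
+\normalsize
+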
If you have several jobs of different priority, it may not be best to start
\label{WritePartAfterJob}
\item [Write Part After Job = \lt{}yes|no\gt{}]
- \index[dir]{Write Part After Job }
+\index[dir]{Write Part After Job}
+\index[dir]{Directive!Write Part After Job}
This directive is only implemented in version 1.37 and later.
If this directive is set to {\bf yes} (default {\bf no}), a new part file
will be created after the job is finished.
\begin{description}
\item [Schedule]
- \index[dir]{Schedule}
- Start of the Schedule directives. No {\bf Schedule} resource is required,
-but
-you will need at least one if you want Jobs to be automatically started.
+\index[dir]{Schedule}
+\index[dir]{Directive!Schedule}
+ Start of the Schedule directives. No {\bf Schedule} resource is
+ required, but you will need at least one if you want Jobs to be
+ automatically started.
\item [Name = \lt{}name\gt{}]
- \index[dir]{Name }
+ \index[dir]{Name}
+ \index[dir]{Directive!Name}
The name of the schedule being defined. The Name directive is required.
\item [Run = \lt{}Job-overrides\gt{} \lt{}Date-time-specification\gt{}]
- \index[dir]{Run }
- The Run directive defines when a Job is to be run, and what overrides if any
-to apply. You may specify multiple {\bf run} directives within a {\bf
-Schedule} resource. If you do, they will all be applied (i.e. multiple
-schedules). If you have two {\bf Run} directives that start at the same time,
-two Jobs will start at the same time (well, within one second of each
-other).
-
-The {\bf Job-overrides} permit overriding the Level, the Storage, the
-Messages, and the Pool specifications provided in the Job resource. In
-addition, the FullPool, the IncrementalPool, and the DifferentialPool
-specifications permit overriding the Pool specification according to what
-backup Job Level is in effect.
-
-By the use of overrides, you may customize a particular Job. For example, you
-may specify a Messages override for your Incremental backups that outputs
-messages to a log file, but for your weekly or monthly Full backups, you may
-send the output by email by using a different Messages override.
-
-{\bf Job-overrides} are specified as: {\bf keyword=value} where the keyword
-is Level, Storage, Messages, Pool, FullPool, DifferentialPool, or
-IncrementalPool, and the {\bf value} is as defined on the respective
-directive formats for the Job resource. You may specify multiple {\bf
-Job-overrides} on one {\bf Run} directive by separating them with one or more
-spaces or by separating them with a trailing comma. For example:
+ \index[dir]{Run}
+ \index[dir]{Directive!Run}
+ The Run directive defines when a Job is to be run, and what overrides if
+ any to apply. You may specify multiple {\bf run} directives within a
+ {\bf Schedule} resource. If you do, they will all be applied (i.e.
+ multiple schedules). If you have two {\bf Run} directives that start at
+ the same time, two Jobs will start at the same time (well, within one
+ second of each other).
+
+ The {\bf Job-overrides} permit overriding the Level, the Storage, the
+ Messages, and the Pool specifications provided in the Job resource. In
+ addition, the FullPool, the IncrementalPool, and the DifferentialPool
+ specifications permit overriding the Pool specification according to
+ what backup Job Level is in effect.
+
+ By the use of overrides, you may customize a particular Job. For
+ example, you may specify a Messages override for your Incremental
+ backups that outputs messages to a log file, but for your weekly or
+ monthly Full backups, you may send the output by email by using a
+ different Messages override.
+
+ {\bf Job-overrides} are specified as: {\bf keyword=value} where the
+ keyword is Level, Storage, Messages, Pool, FullPool, DifferentialPool,
+ or IncrementalPool, and the {\bf value} is as defined on the respective
+ directive formats for the Job resource. You may specify multiple {\bf
+ Job-overrides} on one {\bf Run} directive by separating them with one or
+ more spaces or by separating them with a trailing comma. For example:
\begin{description}
\item [Level=Full]
\index[dir]{Level}
+ \index[dir]{Directive!Level}
is all files in the FileSet whether or not they have changed.
\item [Level=Incremental]
\index[dir]{Level}
+ \index[dir]{Directive!Level}
is all files that have changed since the last backup.
\item [Pool=Weekly]
\index[dir]{Pool}
+ \index[dir]{Directive!Pool}
specifies to use the Pool named {\bf Weekly}.
\item [Storage=DLT\_Drive]
\index[dir]{Storage}
+ \index[dir]{Directive!Storage}
specifies to use {\bf DLT\_Drive} for the storage device.
\item [Messages=Verbose]
\index[dir]{Messages}
+ \index[dir]{Directive!Messages}
specifies to use the {\bf Verbose} message resource for the Job.
\item [FullPool=Full]
\index[dir]{FullPool}
+ \index[dir]{Directive!FullPool}
  specifies to use the Pool named {\bf Full} if the job is a full backup, or
  is upgraded from another type to a full backup.
\item [DifferentialPool=Differential]
\index[dir]{DifferentialPool}
+ \index[dir]{Directive!DifferentialPool}
specifies to use the Pool named {\bf Differential} if the job is a
differential backup.
\item [IncrementalPool=Incremental]
\index[dir]{IncrementalPool}
+ \index[dir]{Directive!IncrementalPool}
specifies to use the Pool named {\bf Incremental} if the job is an
incremental backup.
\item [SpoolData=yes|no]
\index[dir]{SpoolData}
+ \index[dir]{Directive!SpoolData}
tells Bacula to request the Storage daemon to spool data to a disk file
before putting it on tape.
\item [WritePartAfterJob=yes|no]
\index[dir]{WritePartAfterJob}
+ \index[dir]{Directive!WritePartAfterJob}
tells Bacula to request the Storage daemon to write the current part file to
the device when the job is finished (see
\ilink{Write Part After Job directive in the Job
\item [Client (or FileDaemon)]
\index[dir]{Client (or FileDaemon)}
+ \index[dir]{Directive!Client (or FileDaemon)}
Start of the Client directives.
\item [Name = \lt{}name\gt{}]
- \index[dir]{Name }
+ \index[dir]{Name}
+ \index[dir]{Directive!Name}
The client name which will be used in the Job resource directive or in the
console run command. This directive is required.
\item [Address = \lt{}address\gt{}]
- \index[dir]{Address }
+ \index[dir]{Address}
+ \index[dir]{Directive!Address}
  Where the address is a host name, a fully qualified domain name, or a
  network address in dotted quad notation for a Bacula File server daemon.
  This directive is required.
\item [FD Port = \lt{}port-number\gt{}]
- \index[dir]{FD Port }
+ \index[dir]{FD Port}
+ \index[dir]{Directive!FD Port}
  Where the port is a port number at which the Bacula File server daemon
  can be contacted. The default is 9102.
\item [Catalog = \lt{}Catalog-resource-name\gt{}]
- \index[dir]{Catalog }
+ \index[dir]{Catalog}
+ \index[dir]{Directive!Catalog}
This specifies the name of the catalog resource to be used for this Client.
This directive is required.
\item [Password = \lt{}password\gt{}]
- \index[dir]{Password }
+ \index[dir]{Password}
+ \index[dir]{Directive!Password}
This is the password to be used when establishing a connection with the File
services, so the Client configuration file on the machine to be backed up
must have the same password defined for this Director. This directive is
\label{FileRetention}
\item [File Retention = \lt{}time-period-specification\gt{}]
- \index[dir]{File Retention }
+ \index[dir]{File Retention}
+ \index[dir]{Directive!File Retention}
  The File Retention directive defines the length of time that Bacula will
  keep File records in the Catalog database. When this time period expires,
  and if
\label{JobRetention}
\item [Job Retention = \lt{}time-period-specification\gt{}]
- \index[dir]{Job Retention }
+ \index[dir]{Job Retention}
+ \index[dir]{Directive!Job Retention}
The Job Retention directive defines the length of time that Bacula will keep
Job records in the Catalog database. When this time period expires, and if
{\bf AutoPrune} is set to {\bf yes} Bacula will prune (remove) Job records
\label{AutoPrune}
\item [AutoPrune = \lt{}yes|no\gt{}]
- \index[dir]{AutoPrune }
+ \index[dir]{AutoPrune}
+ \index[dir]{Directive!AutoPrune}
If AutoPrune is set to {\bf yes} (default), Bacula (version 1.20 or greater)
will automatically apply the File retention period and the Job retention
period for the Client at the end of the Job. If you set {\bf AutoPrune = no},
stored in the backup archives (on Volumes).
\item [Maximum Concurrent Jobs = \lt{}number\gt{}]
- \index[dir]{Maximum Concurrent Jobs }
+ \index[dir]{Maximum Concurrent Jobs}
+ \index[dir]{Directive!Maximum Concurrent Jobs}
where \lt{}number\gt{} is the maximum number of Jobs with the current Client
that can run concurrently. Note, this directive limits only Jobs for Clients
with the same name as the resource in which it appears. Any other
\ilink{Maximum Concurrent Jobs}{DirMaxConJobs} in the Director's
resource.
-\item [*Priority = \lt{}number\gt{}]
- \index[dir]{*Priority }
+\item [Priority = \lt{}number\gt{}]
+ \index[dir]{Priority}
+ \index[dir]{Directive!Priority}
The number specifies the priority of this client relative to other clients
-that the Director is processing simultaneously. The priority can range from
-1 to 1000. The clients are ordered such that the smaller number priorities
-are performed first (not currently implemented).
+ that the Director is processing simultaneously. The priority can range from
+ 1 to 1000. The clients are ordered such that the smaller number priorities
+ are performed first (not currently implemented).
\end{description}
The following is an example of a valid Client resource definition:
\item [Storage]
\index[dir]{Storage}
+ \index[dir]{Directive!Storage}
Start of the Storage resources. At least one storage resource must be
specified.
\item [Name = \lt{}name\gt{}]
- \index[dir]{Name }
+ \index[dir]{Name}
+ \index[dir]{Directive!Name}
The name of the storage resource. This name appears on the Storage directive
specified in the Job resource and is required.
\item [Address = \lt{}address\gt{}]
- \index[dir]{Address }
+ \index[dir]{Address}
+ \index[dir]{Directive!Address}
Where the address is a host name, a {\bf fully qualified domain name}, or an
{\bf IP address}. Please note that the \lt{}address\gt{} as specified here
will be transmitted to the File daemon, which will then use it to contact the
directive is required.
\item [SD Port = \lt{}port\gt{}]
- \index[dir]{SD Port }
+ \index[dir]{SD Port}
+ \index[dir]{Directive!SD Port}
Where port is the port to use to contact the storage daemon for information
and to start jobs. This same port number must appear in the Storage resource
of the Storage daemon's configuration file. The default is 9103.
\item [Password = \lt{}password\gt{}]
\index[dir]{Password}
+ \index[dir]{Directive!Password}
This is the password to be used when establishing a connection with the
Storage services. This same password also must appear in the Director
resource of the Storage daemon's configuration file. This directive is
\item [Device = \lt{}device-name\gt{}]
\index[dir]{Device}
- This directive specifies the name of the device to be used for the
- storage. This name is not the physical device name, but the logical
- device name as defined on the {\bf Name} directive contained in the {\bf
- Device} resource definition of the {\bf Storage daemon} configuration
- file or if the device is an Autochanger, you must put the name as
- defined on the {\bf Name} directive contained in the {\bf Autochanger}
- resource definition of the {\bf Storage daemon}. You can specify any
- name you would like (even the device name if you prefer) up to a maximum
- of 127 characters in length. The physical device name associated with
- this device is specified in the {\bf Storage daemon} configuration file
- (as {\bf Archive Device}). Please take care not to define two different
- Storage resource directives in the Director that point to the same
- Device in the Storage daemon. Doing so may cause the Storage daemon to
- block (or hang) attempting to open the same device that is already open.
+ \index[dir]{Directive!Device}
+ This directive specifies the name of the device resource to be used for
+ the storage, as known to the Storage daemon. This name is not the
+ physical device name, but the logical device name as defined on the
+ {\bf Name} directive contained in the {\bf Device} resource definition
+ of the {\bf Storage daemon} configuration file. If the device is an
+ Autochanger, you must instead use the name defined on the {\bf Name}
+ directive contained in the {\bf Autochanger} resource definition of the
+ {\bf Storage daemon}. You can specify any name you would like (even the
+ device name if you prefer) up to a maximum of 127 characters in length.
+ The physical device name associated with this device is specified in the
+ {\bf Storage daemon} configuration file (as {\bf Archive Device}).
+ Please take care not to define two different Storage resource directives
+ in the Director that point to the same Device in the Storage daemon.
+ Doing so may cause the Storage daemon to block (or hang) attempting to
+ open the same device that is already open.
This directive is required.
\label{MediaType}
\item [Media Type = \lt{}MediaType\gt{}]
\index[dir]{Media Type}
+ \index[dir]{Directive!Media Type}
This directive specifies the Media Type to be used to store the data.
  This is an arbitrary string of up to 127 characters that you define. It
  can be anything you want. However, it is best to make it
associated with the Job, Bacula can decide to use any Storage daemon
that supports Media Type {\bf DDS-4} and on any drive that supports it.
+ Currently Bacula permits only a single Media Type. Consequently, if
+ you have a drive that supports more than one Media Type, you can
+ give a unique string to Volumes with different intrinsic Media
+ Type (Media Type = DDS-3-4 for DDS-3 and DDS-4 types), but then
+ those volumes will only be mounted on drives indicated with the
+ dual type (DDS-3-4).
+
If you want to tie Bacula to using a single Storage daemon or drive, you
must specify a unique Media Type for that drive. This is an important
point that should be carefully understood. Note, this applies equally
\label{Autochanger1}
\item [Autochanger = \lt{}yes|no\gt{}]
- \index[dir]{Autochanger }
+ \index[dir]{Autochanger}
+ \index[dir]{Directive!Autochanger}
  If you specify {\bf yes} for this directive (the default is {\bf no}), when
you use the {\bf label} command or the {\bf add} command to create a new
Volume, {\bf Bacula} will also request the Autochanger Slot number.
using autochangers.
\item [Maximum Concurrent Jobs = \lt{}number\gt{}]
- \index[dir]{Maximum Concurrent Jobs }
+ \index[dir]{Maximum Concurrent Jobs}
+ \index[dir]{Directive!Maximum Concurrent Jobs}
  where \lt{}number\gt{} is the maximum number of Jobs with the current
  Storage resource that can run concurrently. Note, this directive limits
  only Jobs using this Storage daemon. Any other restrictions on the maximum
concurrent jobs such as in the Director, Job, or Client resources will also
apply in addition to any limit specified here. The default is set to 1, but
-you may set it to a larger number. We strongly recommend that you read the
-WARNING documented under
-\ilink{ Maximum Concurrent Jobs}{DirMaxConJobs} in the Director's
-resource.
-
-While it is possible to set the Director's, Job's, or Client's maximum
-concurrent jobs greater than one, you should take great care in setting the
-Storage daemon's greater than one. By keeping this directive set to one, you
-will avoid having two jobs simultaneously write to the same Volume. Although
-this is supported, it is not currently recommended.
+you may set it to a larger number. However, if you set the Storage
+daemon's number of concurrent jobs greater than one,
+we recommend that you read the
+warning documented under \ilink{Maximum Concurrent Jobs}{DirMaxConJobs}
+in the Director's resource or simply turn data spooling on as documented
+in the \ilink{Data Spooling}{SpoolingChapter} chapter of this manual.
\end{description}
The following is an example of a valid Storage resource definition:
\item [Pool]
\index[dir]{Pool}
+ \index[dir]{Directive!Pool}
Start of the Pool resource. There must be at least one Pool resource
defined.
\item [Name = \lt{}name\gt{}]
- \index[dir]{Name }
+ \index[dir]{Name}
+ \index[dir]{Directive!Name}
The name of the pool. For most applications, you will use the default
pool name {\bf Default}. This directive is required.
\label{MaxVolumes}
\item [Maximum Volumes = \lt{}number\gt{}]
- \index[dir]{Maximum Volumes }
+ \index[dir]{Maximum Volumes}
+ \index[dir]{Directive!Maximum Volumes}
This directive specifies the maximum number of volumes (tapes or files)
zero, any number of volumes will be permitted. In general, this
zero, any number of volumes will be permitted. In general, this
made to disk files do not become too numerous or consume too much space.
\item [Pool Type = \lt{}type\gt{}]
- \index[dir]{Pool Type }
+ \index[dir]{Pool Type}
+ \index[dir]{Directive!Pool Type}
This directive defines the pool type, which corresponds to the type of
Job being run. It is required and may be one of the following:
\end{itemize}
\item [Use Volume Once = \lt{}yes|no\gt{}]
- \index[dir]{Use Volume Once }
+ \index[dir]{Use Volume Once}
+ \index[dir]{Directive!Use Volume Once}
  If set to {\bf yes}, this directive specifies that each volume is to be
used only once. This is most useful when the Media is a file and you
want a new file for each backup that is done. The default is {\bf no}
Volume you must use the {\bf update} command in the Console.
\item [Maximum Volume Jobs = \lt{}positive-integer\gt{}]
- \index[dir]{Maximum Volume Jobs }
+ \index[dir]{Maximum Volume Jobs}
+ \index[dir]{Directive!Maximum Volume Jobs}
This directive specifies the maximum number of Jobs that can be written
to the Volume. If you specify zero (the default), there is no limit.
Otherwise, when the number of Jobs backed up to the Volume equals {\bf
must use the {\bf update} command in the Console.
\item [Maximum Volume Files = \lt{}positive-integer\gt{}]
- \index[dir]{Maximum Volume Files }
+ \index[dir]{Maximum Volume Files}
+ \index[dir]{Directive!Maximum Volume Files}
This directive specifies the maximum number of files that can be written
to the Volume. If you specify zero (the default), there is no limit.
Otherwise, when the number of files written to the Volume equals {\bf
Volume you must use the {\bf update} command in the Console.
\item [Maximum Volume Bytes = \lt{}size\gt{}]
- \index[dir]{Maximum Volume Bytes }
+ \index[dir]{Maximum Volume Bytes}
+ \index[dir]{Directive!Maximum Volume Bytes}
This directive specifies the maximum number of bytes that can be written
to the Volume. If you specify zero (the default), there is no limit
except the physical size of the Volume. Otherwise, when the number of
Volume you must use the {\bf update} command in the Console.
\item [Volume Use Duration = \lt{}time-period-specification\gt{}]
- \index[dir]{Volume Use Duration }
+ \index[dir]{Volume Use Duration}
+ \index[dir]{Directive!Volume Use Duration}
The Volume Use Duration directive defines the time period that the
Volume can be written beginning from the time of first data write to the
Volume. If the time-period specified is zero (the default), the Volume
\ilink{\bf update volume}{UpdateCommand} command in the Console.
\item [Catalog Files = \lt{}yes|no\gt{}]
- \index[dir]{Catalog Files }
+ \index[dir]{Catalog Files}
+ \index[dir]{Directive!Catalog Files}
This directive defines whether or not you want the names of the files
that were saved to be put into the catalog. The default is {\bf yes}.
The advantage of specifying {\bf Catalog Files = No} is that you will
\label{PoolAutoPrune}
\item [AutoPrune = \lt{}yes|no\gt{}]
- \index[dir]{AutoPrune }
+ \index[dir]{AutoPrune}
+ \index[dir]{Directive!AutoPrune}
If AutoPrune is set to {\bf yes} (default), Bacula (version 1.20 or greater)
will automatically apply the Volume Retention period when a new Volume is
needed and no appendable Volumes exist in the Pool. Volume pruning causes
\label{VolRetention}
\item [Volume Retention = \lt{}time-period-specification\gt{}]
- \index[dir]{Volume Retention }
+ \index[dir]{Volume Retention}
+ \index[dir]{Directive!Volume Retention}
The Volume Retention directive defines the length of time that {\bf
Bacula} will keep Job records associated with the Volume in the Catalog
database. When this time period expires, and if {\bf AutoPrune} is set
\label{PoolRecycle}
\item [Recycle = \lt{}yes|no\gt{}]
- \index[dir]{Recycle }
+ \index[dir]{Recycle}
+ \index[dir]{Directive!Recycle}
This directive specifies whether or not Purged Volumes may be recycled.
If it is set to {\bf yes} (default) and Bacula needs a volume but finds
none that are appendable, it will search for and recycle (reuse) Purged
\label{RecycleOldest}
\item [Recycle Oldest Volume = \lt{}yes|no\gt{}]
- \index[dir]{Recycle Oldest Volume }
+ \index[dir]{Recycle Oldest Volume}
+ \index[dir]{Directive!Recycle Oldest Volume}
This directive instructs the Director to search for the oldest used
Volume in the Pool when another Volume is requested by the Storage
daemon and none are available. The catalog is then {\bf pruned}
\label{RecycleCurrent}
\item [Recycle Current Volume = \lt{}yes|no\gt{}]
- \index[dir]{Recycle Current Volume }
+ \index[dir]{Recycle Current Volume}
+ \index[dir]{Directive!Recycle Current Volume}
If Bacula needs a new Volume, this directive instructs Bacula to Prune
the volume respecting the Job and File retention periods. If all Jobs
are pruned (i.e. the volume is Purged), then the Volume is recycled and
\label{PurgeOldest}
\item [Purge Oldest Volume = \lt{}yes|no\gt{}]
- \index[dir]{Purge Oldest Volume }
+ \index[dir]{Purge Oldest Volume}
+ \index[dir]{Directive!Purge Oldest Volume}
This directive instructs the Director to search for the oldest used
Volume in the Pool when another Volume is requested by the Storage
daemon and none are available. The catalog is then {\bf purged}
data. The default is {\bf no}.
\item [Cleaning Prefix = \lt{}string\gt{}]
- \index[dir]{Cleaning Prefix }
+ \index[dir]{Cleaning Prefix}
+ \index[dir]{Directive!Cleaning Prefix}
  This directive defines a prefix string; if it matches the beginning of a
  Volume name during labeling of a Volume, the Volume will be given a
  VolStatus of {\bf Cleaning} and thus Bacula will
\label{Label}
\item [Label Format = \lt{}format\gt{}]
- \index[dir]{Label Format }
+ \index[dir]{Label Format}
+ \index[dir]{Directive!Label Format}
This directive specifies the format of the labels contained in this
pool. The format directive is used as a sort of template to create new
Volume names during automatic Volume labeling.
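+A minimal Pool resource pulling these directives together might look like
+the following sketch (all values are illustrative assumptions, not
+recommendations):
+
+\footnotesize
+\begin{verbatim}
+Pool {
+  Name = Default
+  Pool Type = Backup
+  AutoPrune = yes
+  Recycle = yes
+  Volume Retention = 365 days
+  Maximum Volume Jobs = 1
+  Volume Use Duration = 14 days
+  Label Format = "File-"
+}
+\end{verbatim}
+\normalsize
+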
\item [Catalog]
\index[dir]{Catalog}
+ \index[dir]{Directive!Catalog}
Start of the Catalog resource. At least one Catalog resource must be
defined.
\item [Name = \lt{}name\gt{}]
- \index[dir]{Name }
+ \index[dir]{Name}
+ \index[dir]{Directive!Name}
  The name of the Catalog. It has no necessary relation to the database
  server name. This name will be specified in the Client resource directive
indicating that all catalog data for that Client is maintained in this
Catalog. This directive is required.
\item [password = \lt{}password\gt{}]
- \index[dir]{password }
+ \index[dir]{password}
+ \index[dir]{Directive!password}
This specifies the password to use when logging into the database. This
directive is required.
\item [DB Name = \lt{}name\gt{}]
- \index[dir]{DB Name }
+ \index[dir]{DB Name}
+ \index[dir]{Directive!DB Name}
This specifies the name of the database. If you use multiple catalogs
(databases), you specify which one here. If you are using an external
database server rather than the internal one, you must specify a name
tables using this name. This directive is required.
\item [user = \lt{}user\gt{}]
- \index[dir]{user }
+ \index[dir]{user}
+ \index[dir]{Directive!user}
This specifies what user name to use to log into the database. This
directive is required.
\item [DB Socket = \lt{}socket-name\gt{}]
- \index[dir]{DB Socket }
+ \index[dir]{DB Socket}
+ \index[dir]{Directive!DB Socket}
This is the name of a socket to use on the local host to connect to the
database. This directive is used only by MySQL and is ignored by SQLite.
Normally, if neither {\bf DB Socket} nor {\bf DB Address} is specified, MySQL
will use the default socket.
\item [DB Address = \lt{}address\gt{}]
- \index[dir]{DB Address }
+ \index[dir]{DB Address}
+ \index[dir]{Directive!DB Address}
This is the host address of the database server. Normally, you would specify
this instead of {\bf DB Socket} if the database server is on another machine.
In that case, you will also specify {\bf DB Port}. This directive is used
optional.
\item [DB Port = \lt{}port\gt{}]
- \index[dir]{DB Port }
+ \index[dir]{DB Port}
+ \index[dir]{Directive!DB Port}
This defines the port to be used in conjunction with {\bf DB Address} to
access the database if it is on another machine. This directive is used only
by MySQL and is ignored by SQLite if provided. This directive is optional.
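+Putting the database directives together, a Catalog resource for a MySQL
+server on another machine might be sketched as follows (host, port,
+credentials, and database name are illustrative):
+
+\footnotesize
+\begin{verbatim}
+Catalog {
+  Name = MyCatalog
+  DB Name = bacula
+  user = bacula
+  password = ""
+  DB Address = db.example.com
+  DB Port = 3306
+}
+\end{verbatim}
+\normalsize
+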
%% \item [Multiple Connections = \lt{}yes|no\gt{}]
-%% \index[dir]{Multiple Connections }
+%% \index[dir]{Multiple Connections}
+%% \index[dir]{Directive!Multiple Connections}
%% By default, this directive is set to no. In that case, each job that uses the
%% same Catalog will use a single connection to the catalog. It will be shared,
\begin{description}
\item [Name = \lt{}name\gt{}]
- \index[dir]{Name }
+ \index[dir]{Name}
+ \index[dir]{Directive!Name}
The name of the console. This name must match the name specified in the
Console's configuration resource (much as is the case with Client
definitions).
\item [Password = \lt{}password\gt{}]
- \index[dir]{Password }
+ \index[dir]{Password}
+ \index[dir]{Directive!Password}
Specifies the password that must be supplied for a named Bacula Console
to be authorized. The same password must appear in the {\bf Console}
resource of the Console configuration file. For added security, the
process, otherwise it will be left blank.
\item [JobACL = \lt{}name-list\gt{}]
- \index[dir]{JobACL }
+ \index[dir]{JobACL}
+ \index[dir]{Directive!JobACL}
This directive is used to specify a list of Job resource names that can
be accessed by the console. Without this directive, the console cannot
access any of the Director's Job resources. Multiple Job resource names
for the four jobs named on the JobACL directives, but for no others.
\item [ClientACL = \lt{}name-list\gt{}]
- \index[dir]{ClientACL }
+ \index[dir]{ClientACL}
+ \index[dir]{Directive!ClientACL}
  This directive is used to specify a list of Client resource names that
  can be accessed by the console.
\item [StorageACL = \lt{}name-list\gt{}]
- \index[dir]{StorageACL }
+ \index[dir]{StorageACL}
+ \index[dir]{Directive!StorageACL}
This directive is used to specify a list of Storage resource names that can
be accessed by the console.
\item [ScheduleACL = \lt{}name-list\gt{}]
- \index[dir]{ScheduleACL }
+ \index[dir]{ScheduleACL}
+ \index[dir]{Directive!ScheduleACL}
This directive is used to specify a list of Schedule resource names that can
be accessed by the console.
\item [PoolACL = \lt{}name-list\gt{}]
- \index[dir]{PoolACL }
+ \index[dir]{PoolACL}
+ \index[dir]{Directive!PoolACL}
This directive is used to specify a list of Pool resource names that can be
accessed by the console.
\item [FileSetACL = \lt{}name-list\gt{}]
- \index[dir]{FileSetACL }
+ \index[dir]{FileSetACL}
+ \index[dir]{Directive!FileSetACL}
This directive is used to specify a list of FileSet resource names that can
be accessed by the console.
\item [CatalogACL = \lt{}name-list\gt{}]
- \index[dir]{CatalogACL }
+ \index[dir]{CatalogACL}
+ \index[dir]{Directive!CatalogACL}
This directive is used to specify a list of Catalog resource names that can
be accessed by the console.
\item [CommandACL = \lt{}name-list\gt{}]
- \index[dir]{CommandACL }
+ \index[dir]{CommandACL}
+ \index[dir]{Directive!CommandACL}
  This directive is used to specify a list of console commands that can be
executed by the console.
\end{description}
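+A restricted named Console using the ACL directives above might be
+sketched as follows (all resource names are illustrative):
+
+\footnotesize
+\begin{verbatim}
+Console {
+  Name = restricted-user
+  Password = "xxx"
+  JobACL = kernsave
+  ClientACL = rufus-fd
+  StorageACL = File
+  ScheduleACL = *all*
+  PoolACL = *all*
+  FileSetACL = "Full Set"
+  CatalogACL = MyCatalog
+  CommandACL = run, restore
+}
+\end{verbatim}
+\normalsize
+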
\item [Counter]
\index[dir]{Counter}
+ \index[dir]{Directive!Counter}
Start of the Counter resource. Counter directives are optional.
\item [Name = \lt{}name\gt{}]
- \index[dir]{Name }
+ \index[dir]{Name}
+ \index[dir]{Directive!Name}
The name of the Counter. This is the name you will use in the variable
expansion to reference the counter value.
\item [Minimum = \lt{}integer\gt{}]
- \index[dir]{Minimum }
+ \index[dir]{Minimum}
+ \index[dir]{Directive!Minimum}
This specifies the minimum value that the counter can have. It also becomes
the default. If not supplied, zero is assumed.
\item [Maximum = \lt{}integer\gt{}]
- \index[dir]{Maximum }
+ \index[dir]{Maximum}
+ \index[dir]{Directive!Maximum}
  This is the maximum value that the counter can have. If not specified
or set to zero, the counter can have a maximum value of 2,147,483,648 (2 to
the 31 power). When the counter is incremented past this value, it is reset
to the Minimum.
\item [*WrapCounter = \lt{}counter-name\gt{}]
- \index[dir]{*WrapCounter }
+ \index[dir]{*WrapCounter}
+ \index[dir]{Directive!*WrapCounter}
  If this value is specified, when the counter is incremented past the
  maximum and thus reset to the minimum, the counter specified on the
  {\bf WrapCounter} is incremented. (This is not currently implemented.)
\item [Catalog = \lt{}catalog-name\gt{}]
- \index[dir]{Catalog }
+ \index[dir]{Catalog}
+ \index[dir]{Directive!Catalog}
If this directive is specified, the counter and its values will be saved in
the specified catalog. If this directive is not present, the counter will be
redefined each time that Bacula is started.
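+A counter kept in the catalog, suitable for reference in variable
+expansion, might be defined as follows (the names are illustrative):
+
+\footnotesize
+\begin{verbatim}
+Counter {
+  Name = NextVolume
+  Minimum = 1
+  Maximum = 10000
+  Catalog = MyCatalog
+}
+\end{verbatim}
+\normalsize
+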
FileSet {
Name = "Full Set"
Include {
- Options { signature=SHA1 }
+ Options { signature=SHA1}
#
# Put your list of files here, one per line or include an
# external list with:
#
# Note: / backs up everything
File = /
- }
+}
Exclude {}
}
# When to do the backups