process, otherwise it will be left blank.
\item
\ilink{Job}{JobResource} -- to define the backup/restore Jobs
 and to tie together the Client, FileSet and Schedule resources to be used for
each Job.
\item
\ilink{JobDefs}{JobDefsResource} -- optional resource for
\item [Scripts Directory = \lt{}Directory\gt{}]
\index[dir]{Scripts Directory }
 This directive is optional and, if defined, specifies a directory in which the Director
will look for the Python startup script {\bf DirStartup.py}. This directory
may be shared by other Bacula daemons. Standard shell expansion of the
directory is done when the configuration file is read so that values such
\item [FD Connect Timeout = \lt{}time\gt{}]
\index[dir]{FD Connect Timeout }
 where {\bf time} is the time that the Director should continue attempting to
contact the File daemon to start a job, and after which the Director will
cancel the job. The default is 30 minutes.
\item [SD Connect Timeout = \lt{}time\gt{}]
\index[dir]{SD Connect Timeout }
 where {\bf time} is the time that the Director should continue attempting to
contact the Storage daemon to start a job, and after which the Director will
cancel the job. The default is 30 minutes.
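Sketched as a fragment of the Director resource (the daemon name and timeout values here are illustrative, not defaults):

\footnotesize
\begin{verbatim}
Director {
  Name = bacula-dir
  ...
  FD Connect Timeout = 10 minutes
  SD Connect Timeout = 10 minutes
}
\end{verbatim}
\normalsize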
\item [DirAddresses = \lt{}IP-address-specification\gt{}]
\index[dir]{DirAddresses }
Specify the ports and addresses on which the Director daemon will listen for
Bacula Console connections. Probably the simplest way to explain this is to show
an example:
\footnotesize
\addcontentsline{toc}{subsection}{Job Resource}
The Job resource defines a Job (Backup, Restore, ...) that Bacula must
perform. Each Job resource definition contains the name of a Client and
a FileSet to backup, the Schedule for the Job, where the data
are to be stored, and what media Pool can be used. In effect, each Job
resource must specify What, Where, How, and When or FileSet, Storage,
Backup/Restore/Level, and Schedule respectively. Note, the FileSet must
be specified for a restore job for historical reasons, but it is no longer used.
Only a single type ({\bf Backup}, {\bf Restore}, ...) can be specified for any
job. If you want to backup multiple FileSets on the same Client or multiple
The Job name. This name can be specified on the {\bf Run} command in the
console program to start a job. If the name contains spaces, it must be
specified between quotes. It is generally a good idea to give your job the
 same name as the Client that it will backup. This permits easy identification
of jobs.
When the job actually runs, the unique Job Name will consist of the name you
\item [Restore]
\index[dir]{Restore }
 Run a restore Job. Normally, you will specify only one Restore job which acts
as a sort of prototype that you will modify using the console program in
order to perform restores. Although certain basic information from a Restore
job is saved in the catalog, it is very minimal compared to the information
\item [Level = \lt{}job-level\gt{}]
\index[dir]{Level }
 The Level directive specifies the default Job level to be run. Each different
Job Type (Backup, Restore, ...) has a different set of Levels that can be
specified. The Level is normally overridden by a different value that is
specified in the {\bf Schedule} resource. This directive is not required, but
st\_ctime to change and hence Bacula will backup the file during an
Incremental or Differential backup. In the case of Sophos virus scanning, you
can prevent it from resetting the access time (st\_atime) and hence changing
st\_ctime by using the {\bf \verb:--:no-reset-atime} option. For other software,
please see their manual.
When Bacula does an Incremental backup, all modified files that are still on
process and not currently implemented in Bacula.
In addition, if you move a directory rather than copy it, the files in it do not
have their modification time (st\_mtime) or their attribute change time
(st\_ctime) changed. As a consequence, those files will probably not be backed
up by an Incremental or Differential backup, since both depend solely on these
time stamps. If you move a directory and wish it to be properly backed up, it
is generally preferable to copy it, then
delete the original.
\item [Differential]
files from the catalog during a Differential backup is quite a time consuming
process and not currently implemented in Bacula.
As noted above, if you move a directory rather than copy it, the files in it do
not have their modification time (st\_mtime) or their attribute change time
(st\_ctime) changed. As a consequence, those files will probably not be backed
up by an Incremental or Differential backup, since both depend solely on these
time stamps. If you move a directory and wish it to be properly backed up, it
is generally preferable to copy it, then
delete the original.
\end{description}
\item [DiskToCatalog]
\index[dir]{DiskToCatalog }
 This level causes Bacula to read the files as they currently are on disk, and
to compare the current file attributes with the attributes saved in the
catalog from the last backup for the job specified on the {\bf VerifyJob}
directive. This level differs from the {\bf Catalog} level described above by
\item [FileSet = \lt{}FileSet-resource-name\gt{}]
\index[dir]{FileSet }
 The FileSet directive specifies the FileSet that will be used in the current
Job. The FileSet specifies which directories (or files) are to be backed up,
and what options to use (e.g. compression, ...). Only a single FileSet
resource may be specified in any one Job. For additional details, see the
\item [Messages = \lt{}messages-resource-name\gt{}]
\index[dir]{Messages }
 The Messages directive defines what Messages resource should be used for this
job, and thus how and where the various messages are to be delivered. For
example, you can direct some messages to a log file, and others can be sent
by email. For additional details, see the
\item [Incremental Backup Pool = \lt{}pool-resource-name\gt{}]
\index[dir]{Incremental Backup Pool }
 The {\it Incremental Backup Pool} specifies a Pool to be used for Incremental
 backups. It will override any Pool specification during an Incremental backup.
This resource is optional.
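As an illustrative sketch (the Job and Pool names here are hypothetical):

\footnotesize
\begin{verbatim}
Job {
  Name = "NightlySave"
  ...
  Pool = Default
  Incremental Backup Pool = Inc-Pool
}
\end{verbatim}
\normalsize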
\item [Schedule = \lt{}schedule-name\gt{}]
\index[dir]{Schedule }
The Schedule directive defines what schedule is to be used for the Job. The
schedule determines when the Job will be automatically started and what Job
 level (i.e. Full, Incremental, ...) is to be run. This directive is optional,
and if left out, the Job can only be started manually. For additional
details, see the
\ilink{Schedule Resource Chapter}{ScheduleResource} of this
 manual. If a Schedule resource is specified, the job will be run according to
the schedule specified. If no Schedule resource is specified for the Job,
the job must be manually started using the Console program. Although you may
specify only a single Schedule resource for any one job, the Schedule
\item [Storage = \lt{}storage-resource-name\gt{}]
\index[dir]{Storage }
 The Storage directive defines the name of the storage services where you want
to backup the FileSet data. For additional details, see the
\ilink{Storage Resource Chapter}{StorageResource2} of this manual.
This directive is required.
Bacula checks the exit status of the RunBeforeJob
program. If it is non-zero, the job will be error terminated. Lutz Kittler
 has pointed out that this can be a simple way to modify your schedules during
 a holiday. For example, suppose that you normally do Full backups on Fridays,
 but Thursday and Friday are holidays. To avoid having to change tapes between
Thursday and Friday when no one is in the office, you can create a
 RunBeforeJob that returns a non-zero status on Thursday and zero on all other
days. That way, the Thursday job will not run, and on Friday the tape you
inserted on Wednesday before leaving will be used.
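A minimal sketch of such a script for a Unix client (the script name, path, and day test are illustrative, not part of Bacula):

\footnotesize
\begin{verbatim}
#!/bin/sh
# skip-thursday.sh -- return non-zero on Thursday so the
# Director error-terminates (skips) the job; zero otherwise.
if [ "`date +%a`" = "Thu" ]; then
  exit 1
fi
exit 0
\end{verbatim}
\normalsize

and in the Job resource:

\footnotesize
\begin{verbatim}
RunBeforeJob = "/usr/local/bin/skip-thursday.sh"
\end{verbatim}
\normalsize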
\item [Run After Job = \lt{}command\gt{}]
\index[dir]{Run After Job }
The specified {\bf command} is run as an external program after the current
 job terminates. This directive is not required. The command string must be a
 valid program name or name of a shell script. If the exit code of the program
run is non-zero, the current Bacula job will terminate in error. Before
submitting the specified command to the operating system, Bacula performs
character substitution as described above for the {\bf Run Before Job}
The special characters \&()[]\{\}\^{}=;!'+,`\~{} will need to be quoted if
they are part of a filename or argument.
 If someone is logged in, a blank ``command'' window running the commands will
be present during the execution of the command.
Some Suggestions from Phil Stracchino for running on Win32 machines with the
\begin{enumerate}
\item You might want the ClientRunBeforeJob directive to specify a .bat file
 which runs the actual client-side commands, rather than trying to run (for
example) regedit /e directly.
\item The batch file should explicitly 'exit 0' on successful completion.
\item The path to the batch file should be specified in Unix form:
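For example (the forward-slash form of the same hypothetical path used in the example below):
\footnotesize
\begin{verbatim}
ClientRunBeforeJob =
 "c:/bacula/bin/systemstate.bat"
\end{verbatim}
\normalsize
CORRECT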
rather than DOS/Windows form:
ClientRunBeforeJob =
 ``c:\textbackslash{}bacula\textbackslash{}bin\textbackslash{}systemstate.bat''
INCORRECT
\end{enumerate}
The following example of the use of the Client Run Before Job directive was
submitted by a user:\\
You could write a shell script to back up a DB2 database to a FIFO. The shell script is:
\footnotesize
\begin{verbatim}
The following line in the Job resource in the bacula-dir.conf file:
\footnotesize
\begin{verbatim}
 Client Run Before Job = "su - mercuryd -c \"/u01/mercuryd/backupdb.sh '%t' '%l'\""
\end{verbatim}
\normalsize
 When the job is run, you will get messages from the output of the script stating
 that the backup has started. Even though the command being run is
 backgrounded with \&, the job will block until the "db2 BACKUP DATABASE" command
 completes, and thus the backup stalls.
 To remedy this situation, the "db2 BACKUP DATABASE" line should be changed to the following:
\footnotesize
\begin{verbatim}
 db2 BACKUP DATABASE mercuryd TO $DIR/dbpipe WITHOUT PROMPTING > $DIR/backup.log 2>&1 < /dev/null &
\end{verbatim}
\normalsize
\item [Client Run After Job = \lt{}command\gt{}]
\index[dir]{Client Run After Job }
 This command is the same as {\bf Run After Job} except that it is run on the
client machine. Note, please see the notes above in {\bf Client Run Before
Job} concerning Windows clients.
\item [Rerun Failed Levels = \lt{}yes|no\gt{}]
\index[dir]{Rerun Failed Levels }
 If this directive is set to {\bf yes} (default no), and Bacula detects that a
previous job at a higher level (i.e. Full or Differential) has failed, the
current job level will be upgraded to the higher level. This is particularly
useful for Laptops where they may often be unreachable, and if a prior Full
 save has failed, you wish the very next backup to be a Full save rather than
whatever level it is started as.
\item [Spool Data = \lt{}yes|no\gt{}]
\item [Spool Attributes = \lt{}yes|no\gt{}]
\index[dir]{Spool Attributes }
 The default is set to {\bf no}, which means that the File attributes are sent
by the Storage daemon to the Director as they are stored on tape. However,
if you want to avoid the possibility that database updates will slow down
writing to the tape, you may want to set the value to {\bf yes}, in which
\item [always]
\index[dir]{always }
 when the file to be restored already exists, it is deleted and then replaced by
the copy that was backed up.
\item [ifnewer]
\index[dir]{Reschedule Interval }
If you have specified {\bf Reschedule On Error = yes} and the job
terminates in error, it will be rescheduled after the interval of time
 specified by {\bf time-specification}. See \ilink{the time
specification formats}{Time} in the Configure chapter for details of
time specifications. If no interval is specified, the job will not be
rescheduled on error.
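An illustrative fragment (the interval and count are arbitrary; {\bf Reschedule Times} limits how many times the job will be rescheduled):

\footnotesize
\begin{verbatim}
Job {
  Name = "OfficeBackup"
  ...
  Reschedule On Error = yes
  Reschedule Interval = 1 hour
  Reschedule Times = 3
}
\end{verbatim}
\normalsize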
The default priority is 10.
 If you want to run concurrent jobs, which is not recommended, you should keep
these points in mind:
\begin{itemize}
is scheduled and queued waiting for the running priority 2 job to terminate.
If you then start a second priority 2 job, the waiting priority 1 job will
prevent the new priority 2 job from running concurrently with the running
 priority 2 job. That is: as long as there is a higher priority job waiting to
run, no new lower priority jobs will start even if the Maximum Concurrent
Jobs settings would normally allow them to run. This ensures that higher
priority jobs will be run as soon as possible.
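For example, to ensure that a hypothetical catalog backup job starts only after the regular backups (which use the default priority of 10):

\footnotesize
\begin{verbatim}
Job {
  Name = "BackupCatalog"
  ...
  Priority = 11      # started only after priority 10 jobs
}
\end{verbatim}
\normalsize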
\item [Schedule]
\index[dir]{Schedule }
 Start of the Schedule directives. No {\bf Schedule} resource is required, but
you will need at least one if you want Jobs to be automatically started.
\item [Name = \lt{}name\gt{}]
\item [FullPool=Full]
\index[dir]{FullPool }
 specifies to use the Pool named {\bf Full} if the job is a full backup, or is
upgraded from another type to a full backup.
\item [DifferentialPool=Differential]
\item [Address = \lt{}address\gt{}]
\index[dir]{Address }
 Where the address is a host name, a fully qualified domain name, or a network
address in dotted quad notation for a Bacula File server daemon. This
directive is required.
\item [FD Port = \lt{}port-number\gt{}]
\index[dir]{FD Port }
 Where the port is a port number at which the Bacula File server daemon can be
contacted. The default is 9102.
\item [Catalog = \lt{}Catalog-resource-name\gt{}]
\item [File Retention = \lt{}time-period-specification\gt{}]
\index[dir]{File Retention }
 The File Retention directive defines the length of time that Bacula will keep
File records in the Catalog database. When this time period expires, and if
{\bf AutoPrune} is set to {\bf yes} Bacula will prune (remove) File records
that are older than the specified File Retention period. Note, this affects
This directive specifies the name of the device to be used for the
storage. This name is not the physical device name, but the logical device
name as defined on the {\bf Name} directive contained in the {\bf Device}
resource definition of the {\bf Storage daemon} configuration file or, if
the device is an Autochanger, the name as defined on the {\bf Name}
directive contained in the {\bf Autochanger} resource definition of the
{\bf Storage daemon} configuration file. You can specify any name you would
like (even the device name if you prefer) up to a
maximum of 127 characters in length. The physical device name associated with
this device is specified in the {\bf Storage daemon} configuration file (as
{\bf Archive Device}). Please take care not to define two different Storage
\item [Media Type = \lt{}MediaType\gt{}]
\index[dir]{Media Type }
 This directive specifies the Media Type to be used to store the data. This is
an arbitrary string of characters up to 127 maximum that you define. It can
be anything you want. However, it is best to make it descriptive of the
storage media (e.g. File, DAT, ``HP DLT8000'', 8mm, ...). In addition, it is
\label{Autochanger1}
\item [Autochanger = \lt{}yes|no\gt{}]
\index[dir]{Autochanger }
 If you specify {\bf yes} for this command (the default is {\bf no}), when you
use the {\bf label} command or the {\bf add} command to create a new Volume,
{\bf Bacula} will also request the Autochanger Slot number. This simplifies
creating database entries for Volumes in an autochanger. If you forget to
\item [Maximum Concurrent Jobs = \lt{}number\gt{}]
\index[dir]{Maximum Concurrent Jobs }
 where \lt{}number\gt{} is the maximum number of Jobs with the current Storage
resource that can run concurrently. Note, this directive limits only Jobs
for Jobs using this Storage daemon. Any other restrictions on the maximum
concurrent jobs such as in the Director, Job, or Client resources will also
\item [Pool]
\index[dir]{Pool }
 Start of the Pool resource. There must be at least one Pool resource defined.
\item [Name = \lt{}name\gt{}]
\item [Volume Use Duration = \lt{}time-period-specification\gt{}]
\index[dir]{Volume Use Duration }
 The Volume Use Duration directive defines the time period that the Volume can
be written beginning from the time of first data write to the Volume. If the
time-period specified is zero (the default), the Volume can be written
indefinitely. Otherwise, when the time period from the first write to the
You might use this directive, for example, if you have a Volume used for
Incremental backups, and Volumes used for Weekly Full backups. Once the Full
 backup is done, you will want to use a different Incremental Volume. This can
 be accomplished by setting the Volume Use Duration for the Incremental Volume
 to six days. I.e. it will be used for the 6 days following a Full save, then
a different Incremental volume will be used. Be careful about setting the
duration to short periods such as 23 hours, or you might experience problems
of Bacula waiting for a tape over the weekend only to complete the backups
Monday morning when an operator mounts a new tape.
 The use duration is checked and the {\bf Used} status is set only at the end of a
job that writes to the particular volume, which means that even though the
use duration may have expired, the catalog entry will not be updated until
the next job that uses this volume is run.
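As a sketch, the six-day Incremental example above might look like this (the Pool name is hypothetical):

\footnotesize
\begin{verbatim}
Pool {
  Name = Inc-Pool
  Pool Type = Backup
  Volume Use Duration = 6 days
  ...
}
\end{verbatim}
\normalsize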
This directive defines whether or not you want the names of the files that
were saved to be put into the catalog. The default is {\bf yes}. The
advantage of specifying {\bf Catalog Files = No} is that you will have a
 significantly smaller Catalog database. The disadvantage is that you will not
be able to produce a Catalog listing of the files backed up for each Job
(this is often called Browsing). Also, without the File entries in the
catalog, you will not be able to use the Console {\bf restore} command nor
Volume Retention period. All File records associated with pruned Jobs are
also pruned. The time may be specified as seconds, minutes, hours, days,
 weeks, months, quarters, or years. The {\bf Volume Retention} is applied
 independently of the {\bf Job Retention} and the {\bf File Retention} periods
defined in the Client resource. This means that the shorter period is the
one that applies. Note, that when the {\bf Volume Retention} period has been
reached, it will prune both the Job and the File records.
 The default is 365 days. Note, this directive sets the default value for each
Volume entry in the Catalog when the Volume is created. The value in the
catalog may be later individually changed for each Volume using the Console
program.
By defining multiple Pools with different Volume Retention periods, you may
effectively have a set of tapes that is recycled weekly, another Pool of
 tapes that is recycled monthly and so on. However, one must keep in mind that
if your {\bf Volume Retention} period is too short, it may prune the last
 valid Full backup, and hence until the next Full backup is done, you will not
 have a complete backup of your system, and in addition, the next Incremental
or Differential backup will be promoted to a Full backup. As a consequence,
 the minimum {\bf Volume Retention} period should be at least twice the interval of
your Full backups. This means that if you do a Full backup once a month, the
\item [Accept Any Volume = \lt{}yes|no\gt{}]
\index[dir]{Accept Any Volume }
 This directive specifies whether or not any volume from the Pool may be used
for backup. The default is {\bf yes} as of version 1.27 and later. If it is
{\bf no} then only the first writable volume in the Pool will be accepted for
writing backup data, thus Bacula will fill each Volume sequentially in turn
\item [Cleaning Prefix = \lt{}string\gt{}]
\index[dir]{Cleaning Prefix }
 This directive defines a prefix string, which if it matches the beginning of
a Volume name during labeling of a Volume, the Volume will be defined with
the VolStatus set to {\bf Cleaning} and thus Bacula will never attempt to use
this tape. This is primarily for use with autochangers that accept barcodes
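For example, with the common barcode convention where cleaning tapes are labeled CLNnn (the Pool definition here is illustrative):

\footnotesize
\begin{verbatim}
Pool {
  Name = Default
  Pool Type = Backup
  Cleaning Prefix = "CLN"
  ...
}
\end{verbatim}
\normalsize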
\item [Label Format = \lt{}format\gt{}]
\index[dir]{Label Format }
 This directive specifies the format of the labels contained in this pool. The
format directive is used as a sort of template to create new Volume names
during automatic Volume labeling.
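As a simple sketch, a format containing no variable-expansion characters acts as a fixed prefix to which Bacula appends a volume number (the Pool definition here is illustrative):

\footnotesize
\begin{verbatim}
Pool {
  Name = Default
  Pool Type = Backup
  Label Format = "File-"
  ...
}
\end{verbatim}
\normalsize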
\item [Catalog]
\index[dir]{Catalog }
 Start of the Catalog resource. At least one Catalog resource must be defined.
\item [Name = \lt{}name\gt{}]
\item [user = \lt{}user\gt{}]
\index[dir]{user }
 This specifies what user name to use to log into the database. This directive
is required.
\item [DB Socket = \lt{}socket-name\gt{}]
%% \item [Multiple Connections = \lt{}yes|no\gt{}]
%% \index[dir]{Multiple Connections }
%% By default, this directive is set to no. In that case, each job that uses the
%% same Catalog will use a single connection to the catalog. It will be shared,
%% and Bacula will allow only one Job at a time to communicate. If you set this
%% directive to yes, Bacula will permit multiple connections to the database,
%% this is no problem. For MySQL, you must be *very* careful to have the
%% multi-thread version of the client library loaded on your system. When this
%% directive is set yes, each Job will have a separate connection to the
%% database, and the database will control the interaction between the different
%% Jobs. This can significantly speed up the database operations if you are
%% running multiple simultaneous jobs. In addition, for SQLite and PostgreSQL,
%% Bacula will automatically enable transactions. This can significantly speed
\item [Password = \lt{}password\gt{}]
\index[dir]{Password }
 Specifies the password that must be supplied for a named Bacula Console to be
authorized. The same password must appear in the {\bf Console} resource of
the Console configuration file. For added security, the password is never
actually passed across the network but rather a challenge response hash code
\item [ClientACL = \lt{}name-list\gt{}]
\index[dir]{ClientACL }
 This directive is used to specify a list of Client resource names that can be
accessed by the console.
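For example, to limit a restricted console to two hypothetical clients:

\footnotesize
\begin{verbatim}
Console {
  Name = restricted-console
  ...
  ClientACL = rufus-fd, archive-fd
}
\end{verbatim}
\normalsize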
\item [StorageACL = \lt{}name-list\gt{}]
\item [*WrapCounter = \lt{}counter-name\gt{}]
\index[dir]{*WrapCounter }
 If this value is specified, when the counter is incremented past the maximum
and thus reset to the minimum, the counter specified on the {\bf WrapCounter}
is incremented. (This is not currently implemented).