BP_BULK_LOAD_GFF(1)   User Contributed Perl Documentation  BP_BULK_LOAD_GFF(1)


NAME
       bulk_load_gff.pl - Bulk-load a Bio::DB::GFF database from GFF files

SYNOPSIS
         % bulk_load_gff.pl -d testdb dna1.fa dna2.fa features1.gff features2.gff ...

DESCRIPTION
       This script loads a Bio::DB::GFF database with the features contained
       in a list of GFF files and/or FASTA sequence files. You must use the
       exact variant of GFF described in Bio::DB::GFF. Various command-line
       options allow you to control which database to load and whether to
       allow an existing database to be overwritten.

       This script differs from bp_load_gff.pl in that it is hard-coded to
       use MySQL and cannot perform incremental loads. See bp_load_gff.pl
       for an incremental loader that works with all databases supported by
       Bio::DB::GFF, and bp_fast_load_gff.pl for a MySQL loader that
       supports fast incremental loads.

NOTES
       If the filename is given as "-" then the input is taken from
       standard input. Compressed files (.gz, .Z, .bz2) are automatically
       uncompressed.

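       Taken together, the stdin and decompression behaviors mean a
       compressed GFF file can be streamed straight into the loader without
       an intermediate file. A minimal sketch, assuming the testdb database
       from the synopsis and a hypothetical features.gff.gz:

```shell
# gunzip -c writes the decompressed data to stdout;
# "-" tells the loader to read the GFF stream from stdin
gunzip -c features.gff.gz | bp_bulk_load_gff.pl -d testdb -
```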
       FASTA format files are distinguished from GFF files by their
       filename extensions. Files ending in .fa, .fasta, .fast, .seq, .dna
       and their uppercase variants are treated as FASTA files. Everything
       else is treated as a GFF file. If you wish to load FASTA files from
       STDIN, then use the -f command-line switch with an argument of '-',
       as in

         gunzip -c my_data.fa.gz | bp_bulk_load_gff.pl -d test -f -

       The nature of the bulk load requires that the database be on the
       local machine, that the indicated user have the "file" privilege
       needed to load the tables, and that there be enough room in /usr/tmp
       (or whatever is specified by the $TMPDIR environment variable) to
       hold the tables transiently.

       Local data may now be uploaded to a remote server via the --local
       option, with the database host specified in the dsn, e.g.
       dbi:mysql:test:db_host
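
       A remote load along these lines might look as follows; db_host is
       the placeholder host from the dsn above and the file names are
       hypothetical:

```shell
# --local flags the data files as client-side so they can be
# uploaded to the MySQL server running on db_host
bp_bulk_load_gff.pl -d dbi:mysql:test:db_host --local \
    dna1.fa features1.gff
```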

       The adaptor used is dbi::mysqlopt. There is currently no way to
       change this.

       About maxfeature: the default value is 100,000,000 bases. If you
       have features that are close to or greater than 100 Mb in length,
       then the value of maxfeature should be increased to 1,000,000,000.
       This value must be a power of 10.
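
       For instance, a whole-chromosome feature of roughly 250 Mb exceeds
       the 100,000,000-base default, so maxfeature must be raised to the
       next power of 10. A sketch with hypothetical input files:

```shell
# 250 Mb exceeds the 100,000,000-base default, so use the next
# power of 10 (the value must be a power of 10)
bp_bulk_load_gff.pl -d testdb --maxfeature 1000000000 chr1.fa chr1.gff
```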

       Note that Windows users must use the --create option.

       If the list of GFF or FASTA files exceeds the kernel limit for the
       maximum number of command-line arguments, use the
       --long_list /path/to/files option.

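       The --long_list workaround can be sketched as follows, using the
       placeholder directory from above:

```shell
# rather than expanding thousands of file names on the command line
# (which can exceed the kernel's argument-length limit), point the
# loader at the directory containing the GFF/FASTA files
bp_bulk_load_gff.pl -d testdb --long_list /path/to/files
```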
COMMAND-LINE OPTIONS
       Command-line options can be abbreviated to single-letter options,
       e.g. -d instead of --database.

         --database <dsn>   Database name (default dbi:mysql:test)
         --adaptor          Adaptor name (default mysql)
         --create           Reinitialize/create data tables without asking
         --user             Username to log in as
         --fasta            File or directory containing FASTA files to load
         --long_list        Directory containing a very large number of
                            GFF and/or FASTA files
         --password         Password to use for authentication
                            (Doesn't work with Postgres; the password must
                            be supplied interactively)
         --maxbin           Set the value of the maximum bin size
         --local            Flag to indicate that the data source is local
         --maxfeature       Set the value of the maximum feature size
                            (power of 10)
         --group            A list of one or more tag names (comma or space
                            separated) to be used for grouping in the 9th
                            column
         --gff3_munge       Activate GFF3 name munging (see Bio::DB::GFF)
         --Temporary        Location of a writable scratch directory

SEE ALSO
       Bio::DB::GFF, bp_fast_load_gff.pl, bp_load_gff.pl

AUTHOR
       Lincoln Stein, lstein@cshl.org

       Copyright (c) 2002 Cold Spring Harbor Laboratory

       This library is free software; you can redistribute it and/or modify
       it under the same terms as Perl itself. See DISCLAIMER.txt for
       disclaimers of warranty.


perl v5.8.8                       2007-05-07              BP_BULK_LOAD_GFF(1)