Mirror of https://github.com/gryf/pygtktalog.git, synced 2026-03-26 22:03:30 +01:00
pyGTKtalog: compare view, 20 commits, head 85ab034a36
Commits compared (the author and date columns are empty in this view):
85ab034a36, 4b02641481, 51e3bfa441, c74174fc8f, 54c24b18b1,
7281f9bbbb, 002ff724ea, cd1482e4a1, 028571e9c1, b284f328b3,
01fd964e0d, c257d6ceeb, 28499868d2, 5f13fd7d7a, a1a17158bb,
10e7e87031, 3141add678, dadeebe8a1, 6c6f01781a, 07690f9c94
.gitignore (vendored): 2 changed lines
@@ -5,4 +5,4 @@ __pycache__/
 tags
 MANIFEST
 .cache
-pygtktalog.egg-info
+pycatalog.egg-info
@@ -1,4 +0,0 @@
-include setup.py
-include pavement.py
-include paver-minilib.zip
-include pygtktalog/locale/*/*/*.mo
README.rst: 118 changed lines
@@ -1,132 +1,75 @@
-pyGTKtalog
+pycatalog
 ==========
 
-Pygtktalog is Linux/FreeBSD program for indexing CD, DVD, BR or directories on
-filesystem. It is similar to `gtktalog`_ or `gwhere`_. There is no coincidence
-in name of application, because it's meant to be replacement (in some way) for
-gtktalog, which seems to be dead project for years.
+Pycatalog is a commandline Linux/FreeBSD program for indexing CDs, DVDs, BRs or
+directories on a filesystem. It is similar to `gtktalog`_ or `gwhere`_. There
+is no coincidence in the name of the application: it's meant to be a
+replacement (in some way) for gtktalog, which seems to have been a dead
+project for years.
 
-Current version is 2.0.
+Note that even though it shares the same code base with pyGTKtalog, which was
+meant to be a desktop application, pycatalog is now a pure console app, just
+for use on the command line. You can find the last version of pyGTKtalog under
+the ``pyGTKtalog`` branch, although bear in mind that it was written with
+`python 2.7`_ and pyGTK_, both of which are dead now.
 
-FEATURES
+Current version is 3.0.
 
+Features
 --------
 
 * Scan for files in selected media
 * Support for grouping files depending on file name (expected patterns in file
   names)
-* Get/generate thumbnails from EXIF and other images
 * Store selected EXIF tags
 * Add/edit description and notes
 * Fetch comments for images made in `gThumb`_
-* Add/remove unlimited images to any file or directory
 * `Tagging files`_
 * And more :)
 
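The grouping feature listed above matches "expected patterns" in file names, but the README does not show which patterns those are. As a purely hypothetical illustration (the regex below is my assumption, not pycatalog's actual rule), grouping could strip a trailing "part N" style suffix so related files share one key:

```python
import re

def group_key(filename):
    """Strip a trailing ' part N' / '-N' style suffix (hypothetical
    pattern) so related files share one grouping key."""
    stem = filename.rsplit('.', 1)[0]
    return re.sub(r'[ ._-]*(part|cd|disc)?[ ._-]*\d+$', '', stem,
                  flags=re.IGNORECASE)

# group a few sample names by their computed key
files = ['movie part 1.avi', 'movie part 2.avi', 'other-3.mkv']
groups = {}
for fname in files:
    groups.setdefault(group_key(fname), []).append(fname)
```

With these samples, both "movie part" files end up under the key ``movie`` while the third file gets its own ``other`` group.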
-Frontends
----------
-
-New version of pyGTKtalog was meant to use multiple interfaces.
-
-#. First for the new incarnation of pyGTKtalog is… command line tool for
-   accessing catalog dbs. With ``cmdcatalog.py`` it's possible to:
-
-   * create new catalog
-   * update it
-   * list
-   * find files
-   * fsck (for maintenance for orphaned thumbs/images)
-
-#. ``gtktalog.py``. This is written from scratch frontend in pygtk. Still work
-   in progress.
-
 Requirements
 ------------
 
-pyGTKtalog requires python and following libraries:
+pycatalog requires python and the following libraries:
 
-* `python 2.7`_
+* `python 3.10`_ and up
-* `sqlalchemy 1.0`_
+* `sqlalchemy 1.4`_
-* `pygtk 2.24`_ (only for ``gtktalog.py``)
-* `pillow`_ for image manipulation
 * `exifread`_ for parsing EXIF information
+* `mutagen`_ for extracting tags from audio files
 
-It may work on other (lower) version of libraries, and it should work with
-higher versions of libraries, although it will not work on Python 3 yet, nor
-GTK3.
-
-pyGTKtalog extensively uses external programs in unix spirit, however there is
+Pycatalog extensively uses external programs in unix spirit; however, there is
 a small possibility of using it on Windows (probably with limitations) and
 quite a big possibility to run it on other sophisticated unix-like systems
 (i.e. BeOS/ZETA/Haiku, QNX or MacOSX).
 
 Programs that are used:
-* ``mencoder`` (provided by `mplayer`_ package)
+* ``midentify`` (provided by the `mplayer`_ package)
-* ``montage``, ``convert`` from `ImageMagick`_
 
 For the development process the following programs are used:
 
-* `gettext`_
-* `intltool`_
 * `nose`_
 * `coverage`_
-* `paver`_
 * `tox`_
 
-INSTALATION
+Installation
 -----------
 
 You don't have to install it if you don't want to. You can just change the
-current directory to pyGTKtalog and simply run::
+current directory to pycatalog and simply run::
 
   $ paver run
 
 That's it. Alternatively, if you'd like to put it in a more system-wide
 place, all you have to do is:
 
-#. put the pyGTKtalog directory into your destination of choice
+#. put the pycatalog directory into your destination of choice
    (/usr/local/share, /opt or ~/ is a typical bet)
 
-#. copy the pyGTKtalog shell script to /usr/bin, /usr/local/bin or
+#. copy the pycatalog shell script to /usr/bin, /usr/local/bin or
    some other place where the PATH variable points, or wherever you feel like
 
-#. then modify pyGTKtalog line 6 to match the right ``pygtktalog.py`` directory
+#. then modify pycatalog line 6 to match the right ``pycatalog.py`` directory
 
-Then, just run the pyGTKtalog script.
+Then, just run the pycatalog script.
 
-Technical details
------------------
-
-Catalog file is a plain sqlite database (optionally compressed with bzip2).
-All images are stored in the location pointed to by the db entry in the
-``config`` table; it is assumed that the images directory will be placed
-within the root directory where the main db lies.
-A sha512 hash is generated from the image file itself. There is a small
-possibility of two identical hashes for different image files; however, no
-images are overwritten. The thumbnail filename for each image is simply the
-concatenation of the image filename in the images directory and the '_t'
-string.
-
-There is also a converter from the old database format to the new one, for
-internal use only. In the public release there will be no other formats, so it
-will be useless and deleted. There are some issues with converting: all
-thumbnails will be lost, and all images without a big image will be lost.
-There are serious changes in the application design; I decided it is better to
-keep media unpacked on disk instead of packing it on every save and unpacking
-on every open. The new design prevents deleting any file from the media
-directory (placed in ``~/.pygtktalog/images``). Functionality for exporting
-images and the corresponding db file is planned.
-
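The removed "Technical details" section describes a content-addressed layout: images are named by the sha512 of their bytes, and each thumbnail is the image filename with '_t' appended. A minimal sketch of that naming scheme; the '_t'-before-extension placement is inferred from the ``fname_.replace('_t.', '.')`` handling in the fsck code further down, and the ``.jpg`` extension is just a sample assumption:

```python
import hashlib

def image_store_name(data, ext='.jpg'):
    """Name an image in the images directory by the sha512 hex digest
    of its bytes, as the removed section describes."""
    return hashlib.sha512(data).hexdigest() + ext

def thumb_name(image_name):
    """Thumbnail name: the image filename with '_t' inserted before
    the extension (or appended, if there is no extension)."""
    stem, dot, ext = image_name.rpartition('.')
    return stem + '_t' + dot + ext if dot else image_name + '_t'

name = image_store_name(b'raw image bytes')
print(thumb_name(name))
```

Because the name is derived from the file contents, two byte-identical images collapse to one stored file, which is why collisions are rare and nothing is overwritten.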
-DEVELOPMENT
------------
-
-Several tools have been used to develop pyGTKtalog.
-
-Paver
-^^^^^
-
-I've chosen `Paver`_ as a make equivalent. Inside the main project directory
-there is a ``pavement.py`` script, which provides several tasks that can be
-helpful when working with the sources. Paver is also used to generate a
-standard ``setup.py``.
-
 LICENSE
 =======
@@ -137,18 +80,13 @@ file in top-level directory.
 
 .. _coverage: http://nedbatchelder.com/code/coverage/
 .. _exifread: https://github.com/ianare/exif-py
-.. _gettext: http://www.gnu.org/software/gettext/gettext.html
 .. _gthumb: http://gthumb.sourceforge.net
 .. _gtktalog: http://www.nongnu.org/gtktalog/
 .. _gwhere: http://www.gwhere.org/home.php3
-.. _imagemagick: http://imagemagick.org/script/index.php
-.. _intltool: http://www.gnome.org/
 .. _mplayer: http://mplayerhq.hu
 .. _nose: http://code.google.com/p/python-nose/
-.. _paver: https://pythonhosted.org/paver/
-.. _pillow: https://python-pillow.org/
-.. _pygtk 2.24: http://www.pygtk.org
-.. _python 2.7: http://www.python.org/
-.. _sqlalchemy 1.0: http://www.sqlalchemy.org
+.. _python 3.10: http://www.python.org/
+.. _sqlalchemy 1.4: http://www.sqlalchemy.org
 .. _tagging files: http://en.wikipedia.org/wiki/tag_%28metadata%29
 .. _tox: https://testrun.org/tox
+.. _mutagen: https://github.com/quodlibet/mutagen
pavement.py: 228 lines, file deleted
@@ -1,228 +0,0 @@
-"""
-Project: pyGTKtalog
-Description: Makefile and setup.py replacement. Used python packages -
-    paver, nosetests. External commands - xgettext, intltool-extract, hg,
-    grep.
-Type: management
-Author: Roman 'gryf' Dobosz, gryf73@gmail.com
-Created: 2009-05-07
-"""
-import os
-import sys
-import shutil
-from datetime import datetime
-
-from paver.easy import sh, dry, call_task, options, Bunch
-from paver.tasks import task, needs, help, cmdopts
-from paver.setuputils import setup
-from paver.misctasks import generate_setup, minilib
-import paver.doctools
-
-try:
-    from pylint import lint
-    HAVE_LINT = True
-except ImportError:
-    HAVE_LINT = False
-
-PO_HEADER = """#
-# pygtktalog Language File
-#
-msgid ""
-msgstr ""
-"Project-Id-Version: pygtktalog\\n"
-"POT-Creation-Date: %(time)s\\n"
-"Last-Translator: Roman Dobosz<gryf73@gmail.com>\\n"
-"MIME-Version: 1.0\\n"
-"Content-Type: text/plain; charset=UTF-8\\n"
-"Content-Transfer-Encoding: utf-8\\n"
-"""
-
-REV = os.popen("hg sum 2>/dev/null|grep ^Revis|cut -d ' ' -f 2").readlines()
-if REV:
-    REV = "r" + REV[0].strip()
-else:
-    REV = '0'
-
-LOCALES = {'pl': 'pl_PL.utf8', 'en': 'en_EN'}
-POTFILE = 'locale/pygtktalog.pot'
-
-
-# distutil/setuptool setup method.
-setup(
-    name='pyGTKtalog',
-    version='1.99.%s' % REV,
-
-    long_description='pyGTKtalog is application similar to WhereIsIT, '
-                     'for collecting information on files from CD/DVD '
-                     'or directories.',
-    description='pyGTKtalog is a file indexing tool written in pyGTK',
-    author='Roman Dobosz',
-    author_email='gryf73@gmail.com',
-    url='http://google.com',
-    platforms=['Linux', 'BSD'],
-    license='GNU General Public License (GPL)',
-    classifiers=['Development Status :: 2 - Pre-Alpha',
-                 'Environment :: X11 Applications :: GTK',
-                 'Intended Audience :: End Users/Desktop',
-                 'License :: OSI Approved :: GNU General Public License '
-                 '(GPL)',
-                 'Natural Language :: English',
-                 'Natural Language :: Polish',
-                 'Operating System :: POSIX :: Linux',
-                 'Programming Language :: Python',
-                 'Topic :: Desktop Environment',
-                 'Topic :: Utilities'],
-
-    include_package_data=True,
-    exclude_package_data={'': ['*.patch']},
-    packages=["pygtktalog"],
-    scripts=['bin/gtktalog.py'],
-    test_suite='nose.collector'
-)
-
-options(sphinx=Bunch(builddir="build", sourcedir="source"))
-
-
-@task
-@needs(['locale_gen', 'minilib', 'generate_setup'])
-def sdist():
-    """sdist with message catalogs"""
-    call_task("setuptools.command.sdist")
-
-
-@task
-@needs(['locale_gen'])
-def build():
-    """build with message catalogs"""
-    call_task("setuptools.command.build")
-
-
-@task
-def clean():
-    """remove 'pyo', 'pyc', 'h' and '~' files"""
-    # clean *.pyc, *.pyo and jEdit backup files *~
-    for root, dummy, files in os.walk("."):
-        for fname in files:
-            if fname.endswith(".pyc") or fname.endswith(".pyo") or \
-                    fname.endswith("~") or fname.endswith(".h") or \
-                    fname == '.coverage':
-                fdel = os.path.join(root, fname)
-                os.unlink(fdel)
-                print "deleted", fdel
-
-
-@task
-@needs(["clean"])
-def distclean():
-    """make clean, and remove any dist/build/egg stuff from project"""
-    for dirname in [os.path.join('pygtktalog', 'locale'), 'build', 'dist',
-                    'pyGTKtalog.egg-info']:
-        if os.path.exists(dirname):
-            shutil.rmtree(dirname, ignore_errors=True)
-            print "removed directory", dirname
-
-    for filename in ['paver-minilib.zip', 'setup.py', 'tests/.coverage']:
-        if os.path.exists(filename):
-            os.unlink(filename)
-            print "deleted", filename
-
-
-@task
-def run():
-    """run application"""
-    sh("PYTHONPATH=%s:$PYTHONPATH bin/gtktalog.py" % _setup_env())
-    #import gtktalog
-    #gtktalog.run()
-
-
-@task
-def pot():
-    """generate 'pot' file out of python/glade files"""
-    if not os.path.exists('locale'):
-        os.mkdir('locale')
-
-    if not os.path.exists(POTFILE):
-        fname = open(POTFILE, "w")
-        fname.write(PO_HEADER % {'time': datetime.now()})
-
-    cmd = "xgettext --omit-header -k_ -kN_ -j -o %s %s"
-    cmd_glade = "intltool-extract --type=gettext/glade %s"
-
-    for walk_tuple in os.walk("pygtktalog"):
-        root = walk_tuple[0]
-        for fname in walk_tuple[2]:
-            if fname.endswith(".py"):
-                sh(cmd % (POTFILE, os.path.join(root, fname)))
-            elif fname.endswith(".glade"):
-                sh(cmd_glade % os.path.join(root, fname))
-                sh(cmd % (POTFILE, os.path.join(root, fname + ".h")))
-
-
-@task
-@needs(['pot'])
-def locale_merge():
-    """create or merge if exists 'po' translation files"""
-    potfile = os.path.join('locale', 'pygtktalog.pot')
-
-    for lang in LOCALES:
-        msg_catalog = os.path.join('locale', "%s.po" % lang)
-        if os.path.exists(msg_catalog):
-            sh('msgmerge -U %s %s' % (msg_catalog, potfile))
-        else:
-            shutil.copy(potfile, msg_catalog)
-
-
-@task
-@needs(['locale_merge'])
-def locale_gen():
-    """generate message catalog file for available locale files"""
-    full_path = os.path.join('pygtktalog', 'locale')
-    if not os.path.exists(full_path):
-        os.mkdir(full_path)
-
-    for lang in LOCALES:
-        lang_path = full_path
-        for dirname in [lang, 'LC_MESSAGES']:
-            lang_path = os.path.join(lang_path, dirname)
-            if not os.path.exists(lang_path):
-                os.mkdir(lang_path)
-        catalog_file = os.path.join(lang_path, 'pygtktalog.mo')
-        msg_catalog = os.path.join('locale', "%s.po" % lang)
-        sh('msgfmt %s -o %s' % (msg_catalog, catalog_file))
-
-
-if HAVE_LINT:
-    @task
-    def pylint():
-        '''Check the module you're building with pylint.'''
-        pylintopts = ['pygtktalog']
-        dry('pylint %s' % (" ".join(pylintopts)), lint.Run, pylintopts)
-
-
-@task
-@cmdopts([('coverage', 'c', 'display coverage information')])
-def test(options):
-    """run unit tests"""
-    cmd = "PYTHONPATH=%s:$PYTHONPATH nosetests -w tests" % _setup_env()
-    if hasattr(options.test, 'coverage'):
-        cmd += " --with-coverage --cover-package pygtktalog"
-    os.system(cmd)
-
-
-@task
-@needs(['locale_gen'])
-def runpl():
-    """run application with pl_PL localization. This is just for my
-    convenience"""
-    os.environ['LC_ALL'] = 'pl_PL.utf8'
-    run()
-
-
-def _setup_env():
-    """Helper function to set up paths"""
-    # current directory
-    this_path = os.path.dirname(os.path.abspath(__file__))
-    if this_path not in sys.path:
-        sys.path.insert(0, this_path)
-
-    return this_path
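pavement.py derives the package version from the Mercurial revision by shelling out to ``hg sum | grep ^Revis | cut -d ' ' -f 2``. A rough, hypothetical Python 3 port of just the parsing step, written as a pure function so it can be shown (and tested) without hg installed:

```python
def parse_rev(hg_sum_output):
    """Extract 'rNNN' from `hg sum` output, mirroring the shell
    pipeline `grep ^Revis | cut -d ' ' -f 2`; fall back to '0'."""
    for line in hg_sum_output.splitlines():
        if line.startswith('Revis'):
            parts = line.split(' ')
            if len(parts) > 1:
                return 'r' + parts[1].strip()
    return '0'
```

The real script would feed this from ``subprocess.run(['hg', 'sum'], capture_output=True, text=True).stdout`` instead of ``os.popen``.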
scripts/cmdcatalog.py → pycatalog/__init__.py (Executable file → Normal file): 295 changed lines
@@ -1,63 +1,32 @@
-#!/usr/bin/env python
 """
-Fast and ugly CLI interface for pyGTKtalog
+Fast and ugly CLI interface
 """
 import argparse
-import errno
 import os
 import re
-import sys
 
 from sqlalchemy import or_
 
-from pygtktalog import scan
-from pygtktalog import misc
-from pygtktalog import dbobjects as dbo
-from pygtktalog.dbcommon import connect, Session
-from pygtktalog import logger
+from pycatalog import scan
+from pycatalog import misc
+from pycatalog import dbobjects as dbo
+from pycatalog import dbcommon
+from pycatalog import logger
 
-BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE = range(30, 38)
-
-RESET_SEQ = '\033[0m'
-COLOR_SEQ = '\033[1;%dm'
-BOLD_SEQ = '\033[1m'
-
-LOG = logger.get_logger(__name__)
-
-def colorize(txt, color):
-    """Pretty print with colors to console."""
-    color_map = {'black': BLACK,
-                 'red': RED,
-                 'green': GREEN,
-                 'yellow': YELLOW,
-                 'blue': BLUE,
-                 'magenta': MAGENTA,
-                 'cyan': CYAN,
-                 'white': WHITE}
-    return COLOR_SEQ % color_map[color] + txt + RESET_SEQ
-
-
-def asserdb(func):
-    def wrapper(args):
-        if not os.path.exists(args.db):
-            print colorize("File `%s' does not exists!" % args.db, 'red')
-            sys.exit(1)
-        func(args)
-    return wrapper
-
-
+LOG = logger.get_logger()
 TYPE_MAP = {0: 'd', 1: 'd', 2: 'f', 3: 'l'}
 
 
 class Iface(object):
     """Main class which interacts with the pyGTKtalog modules"""
-    def __init__(self, dbname, pretend=False, debug=False):
+    def __init__(self, dbname, pretend=False, debug=False, use_color=True):
         """Init"""
-        self.engine = connect(dbname)
-        self.sess = Session()
+        self.engine = dbcommon.connect(dbname)
+        self.sess = dbcommon.Session()
         self.dry_run = pretend
         self.root = None
         self._dbname = dbname
+        self.use_color = use_color
         if debug:
             scan.LOG.setLevel('DEBUG')
             LOG.setLevel('DEBUG')
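The hunk above removes the module-level ANSI helpers in favour of ``misc.colorize``. Based on the constants and the deleted ``colorize`` shown in the old side (``COLOR_SEQ = '\033[1;%dm'``, ``RESET_SEQ = '\033[0m'``, colors mapped onto codes 30 through 37), the helper presumably still looks like this self-contained sketch:

```python
# ANSI SGR color codes 30..37, as in the removed module-level constants
BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE = range(30, 38)
RESET_SEQ = '\033[0m'
COLOR_SEQ = '\033[1;%dm'

def colorize(txt, color):
    """Wrap txt in a bold ANSI color sequence followed by a reset."""
    color_map = {'black': BLACK, 'red': RED, 'green': GREEN,
                 'yellow': YELLOW, 'blue': BLUE, 'magenta': MAGENTA,
                 'cyan': CYAN, 'white': WHITE}
    return COLOR_SEQ % color_map[color] + txt + RESET_SEQ
```

So ``colorize('error', 'red')`` produces ``\033[1;31merror\033[0m``, which terminals render as bold red text.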
@@ -96,11 +65,11 @@ class Iface(object):
         """Make the path to the item in the DB"""
         orig_node = node
         if node.parent == node:
-            return {u'/': (u' ', 0, u' ')}
+            return {'/': (u' ', 0, u' ')}
 
         ext = ''
-        if node.parent.type == dbo.TYPE['root']:
-            ext = colorize(' (%s)' % node.filepath, 'white')
+        if node.parent.type == dbo.TYPE['root'] and self.use_color:
+            ext = misc.colorize(' (%s)' % node.filepath, 'white')
 
         path = []
         path.append(node.filename)
@@ -140,8 +109,11 @@ class Iface(object):
         self.sess.commit()
         self.sess.close()
 
-    def list(self, path=None, recursive=False, long_=False):
+    def list(self, path=None, recursive=False, long_=False, mode='plain'):
         """Simulate ls command for the provided item path"""
+        if mode == 'mc':
+            self.use_color = False
+
         self.root = self.sess.query(dbo.File)
         self.root = self.root.filter(dbo.File.type == dbo.TYPE['root']).first()
         if path:
@@ -151,16 +123,37 @@ class Iface(object):
             node = self.root
             msg = "Content of path `/':"
 
-        print colorize(msg, 'white')
+        if mode != 'mc':
+            print(misc.colorize(msg, 'white'))
 
         if recursive:
             items = self._walk(node)
         else:
             items = self._list(node)
 
-        if long_:
+        if mode == 'mc':
             filenames = []
-            format_str = (u'{} {:>%d,} {} {}' %
+            format_str = ('{} 1 {} {} {:>%d} {} {}' %
+                          len(str(sorted([i[1] for i in
+                                          items.values()])[-1])))
+            for fname in sorted(items.keys()):
+                type_, size, date = items[fname]
+                if type_ == 'd':
+                    perms = 'drwxrwxrwx'
+                elif type_ == 'l':
+                    perms = 'lrw-rw-rw-'
+                elif type_ == 'f':
+                    perms = '-rw-rw-rw-'
+                else:
+                    continue
+                filenames.append(format_str
+                                 .format(perms, os.getuid(),
+                                         os.getgid(), size,
+                                         date.strftime('%d/%m/%Y %H:%M:%S'),
+                                         fname))
+        elif long_:
+            filenames = []
+            format_str = ('{} {:>%d,} {} {}' %
                           _get_highest_size_length(items))
             for fname in sorted(items.keys()):
                 type_, size, date = items[fname]
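The new ``mc`` branch above emits ls-style lines (fake permissions, a hard-link count of 1, uid/gid, a right-aligned size, and a date), presumably so the listing can be consumed by a tool such as Midnight Commander's external panelize. A standalone sketch of that formatting, with the size column width computed from the largest size exactly as in the diff (the sample ``items`` dict is made up here):

```python
import datetime
import os

# sample items: name -> (type, size, date), shaped like _list/_walk output
when = datetime.datetime(2020, 1, 2, 3, 4, 5)
items = {'docs': ('d', 0, when),
         'movie.avi': ('f', 123456, when)}

# size column width = digits of the largest size among the items
width = len(str(sorted(size for _, size, _ in items.values())[-1]))
format_str = '{} 1 {} {} {:>%d} {} {}' % width
perm_map = {'d': 'drwxrwxrwx', 'l': 'lrw-rw-rw-', 'f': '-rw-rw-rw-'}

lines = []
for fname in sorted(items):
    type_, size, date = items[fname]
    lines.append(format_str.format(perm_map[type_], os.getuid(),
                                   os.getgid(), size,
                                   date.strftime('%d/%m/%Y %H:%M:%S'),
                                   fname))
```

For the sample data this yields one directory line starting with ``drwxrwxrwx 1`` and one file line ending in ``movie.avi`` with its size right-aligned in a six-character column.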
@@ -168,7 +161,7 @@ class Iface(object):
         else:
             filenames = sorted(items.keys())
 
-        print '\n'.join(filenames)
+        print('\n'.join(filenames))
 
     def update(self, path, dir_to_update=None):
         """
@@ -179,8 +172,8 @@ class Iface(object):
         self.root = self.root.filter(dbo.File.type == dbo.TYPE['root']).first()
         node = self._resolve_path(path)
         if node == self.root:
-            print colorize('Cannot update entire db, since root was provided '
-                           'as path.', 'red')
+            print(misc.colorize('Cannot update entire db, since root was '
+                                'provided as path.', 'red'))
             return
 
         if not dir_to_update:
@@ -189,14 +182,14 @@ class Iface(object):
         if not os.path.exists(dir_to_update):
             raise OSError("Path to update doesn't exist: %s", dir_to_update)
 
-        print colorize("Updating node `%s' against directory "
-                       "`%s'" % (path, dir_to_update), 'white')
+        print(misc.colorize("Updating node `%s' against directory "
+                            "`%s'" % (path, dir_to_update), 'white'))
         if not self.dry_run:
             scanob = scan.Scan(dir_to_update)
             # scanob.update_files(node.id)
             scanob.update_files(node.id, self.engine)
 
-    def create(self, dir_to_add, data_dir):
+    def create(self, dir_to_add, label=None):
         """Create new database"""
         self.root = dbo.File()
         self.root.id = 1
@@ -206,27 +199,17 @@ class Iface(object):
         self.root.type = 0
         self.root.parent_id = 1
 
-        config = dbo.Config()
-        config.key = 'image_path'
-        config.value = data_dir
-
         if not self.dry_run:
             self.sess.add(self.root)
-            self.sess.add(config)
             self.sess.commit()
 
-        print colorize("Creating new db against directory `%s'" % dir_to_add,
-                       'white')
+        print(misc.colorize("Creating new db against directory `%s'" %
+                            dir_to_add, 'white'))
         if not self.dry_run:
-            if data_dir == ':same_as_db:':
-                misc.calculate_image_path(None, True)
-            else:
-                misc.calculate_image_path(data_dir, True)
-
             scanob = scan.Scan(dir_to_add)
-            scanob.add_files(self.engine)
+            scanob.add_files(label=label)
 
-    def add(self, dir_to_add):
+    def add(self, dir_to_add, label=None):
         """Add new directory to the db"""
         self.root = self.sess.query(dbo.File)
         self.root = self.root.filter(dbo.File.type == 0).first()
@@ -234,10 +217,10 @@ class Iface(object):
         if not os.path.exists(dir_to_add):
             raise OSError("Path to add doesn't exist: %s", dir_to_add)
 
-        print colorize("Adding directory `%s'" % dir_to_add, 'white')
+        print(misc.colorize("Adding directory `%s'" % dir_to_add, 'white'))
         if not self.dry_run:
             scanob = scan.Scan(dir_to_add)
-            scanob.add_files()
+            scanob.add_files(label=label)
 
     def _annotate(self, item, search_words):
         """
@@ -257,12 +240,12 @@ class Iface(object):
                 if idx in indexes:
                     if not highlight:
                         highlight = True
-                        result.append(COLOR_SEQ % WHITE)
+                        result.append(misc.COLOR_SEQ % misc.WHITE)
                     result.append(char)
                 else:
                     if highlight:
                         highlight = False
-                        result.append(RESET_SEQ)
+                        result.append(misc.RESET_SEQ)
                     result.append(char)
 
         return "".join(result)
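The ``_annotate`` body above toggles highlighting: it emits the white color sequence when entering a run of matched character indexes and the reset sequence when leaving it. A condensed, self-contained sketch of that toggling logic (the precomputed ``indexes`` set and the standalone function shape are simplifications of the method):

```python
# ANSI constants matching the ones referenced as misc.COLOR_SEQ etc.
COLOR_SEQ, RESET_SEQ, WHITE = '\033[1;%dm', '\033[0m', 37

def annotate(text, indexes):
    """Highlight the characters whose positions are in `indexes`,
    emitting the color sequence on entering a matched run and the
    reset on leaving it, as _annotate does."""
    result, highlight = [], False
    for idx, char in enumerate(text):
        if idx in indexes:
            if not highlight:
                highlight = True
                result.append(COLOR_SEQ % WHITE)
        else:
            if highlight:
                highlight = False
                result.append(RESET_SEQ)
        result.append(char)
    return ''.join(result)
```

Emitting the escape codes only at run boundaries keeps the output short: a contiguous match of any length costs just one color sequence and one reset.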
@@ -273,119 +256,19 @@ class Iface(object):
         result = []
 
         for word in search_words:
-            phrase = u'%%%s%%' % word.decode('utf-8')
+            phrase = u'%%%s%%' % word
             query = query.filter(dbo.File.filename.like(phrase))
 
         for item in query.all():
             result.append(self._get_full_path(item))
 
         if not result:
-            print "No results for `%s'" % ' '.join(search_words)
+            print("No results for `%s'" % ' '.join(search_words))
             return
 
         result.sort()
         for item in result:
-            print self._annotate(item, search_words)
+            print(self._annotate(item, search_words))
 
-    def fsck(self):
-        """Fsck orphaned images/thumbs"""
-        image_path = self.sess.query(dbo.Config).\
-            filter(dbo.Config.key=='image_path').one().value
-
-        if image_path == ':same_as_db:':
-            image_path = misc.calculate_image_path(None, False)
-
-        files_to_remove = []
-        obj_to_remove = []
-
-        # remove images/thumbnails which doesn't have file relation
-        for name, obj in (("images", dbo.Image),
-                          ("thumbnails", dbo.Thumbnail)):
-            self._purge_orphaned_objects(obj, "Scanning %s " % name)
-
-        # find all image files not associate with either Image (image/thumb)
-        # or Thumbnail (thumb) objects
-        sys.stdout.write(40 * " " + "\r")
-        count = 0
-        for root, dirs, files in os.walk(image_path):
-            for fname in files:
-                sys.stdout.write("Scanning files " +
-                                 "| / - \\".split()[count % 4] + "\r")
-                sys.stdout.flush()
-                count += 1
-
-                fname_ = os.path.join(root.split(image_path)[1],
-                                      fname).lstrip('/')
-
-                if '_t' in fname:
-                    obj = self.sess.query(dbo.Thumbnail)\
-                        .filter(dbo.Thumbnail.filename==fname_).all()
-                    if obj:
-                        continue
-
-                    obj = self.sess.query(dbo.Image)\
-                        .filter(dbo.Image.filename==\
-                                fname_.replace('_t.', '.')).all()
-                    if obj:
-                        continue
-
-                else:
-                    obj = self.sess.query(dbo.Image)\
-                        .filter(dbo.Image.filename==fname_).all()
-                    if obj:
-                        continue
-
-                files_to_remove.append(os.path.join(root, fname))
-
-        LOG.debug("Found %d orphaned files", len(files_to_remove))
-        sys.stdout.write(40 * " " + "\r")
-        sys.stdout.flush()
-
-        if self.dry_run:
-            print "Following files are not associated to any items in the DB:"
-            for filename in sorted(files_to_remove):
-                print filename
-            self.sess.rollback()
-        else:
-            _remove_files(image_path, files_to_remove)
-            self.sess.commit()
-
-    def _purge_orphaned_objects(self, sa_class, msg):
-        """Return tuple of lists of images that are orphaned"""
-        ids_to_remove = []
-
-        for count, item in enumerate(self.sess.query(sa_class).all()):
-            sys.stdout.write(msg + "| / - \\".split()[count % 4] + "\r")
-            if not item.file:
-                self.sess.delete(item)
-                ids_to_remove.append(item.id)
-            del item
-        sys.stdout.flush()
-
-        LOG.debug("Found %d orphaned object of class %s",
-                  len(ids_to_remove), sa_class.__name__)
-        self.sess.flush()
-
-
-def _remove_files(image_path, filenames):
-    """Remove files and empty directories in provided location"""
-    count = 0
-    for count, fname in enumerate(filenames, start=1):
-        os.unlink(fname)
-
-    LOG.info("Removed %d orphaned files", count)
-
-    count = 0
-    for root, dirs, _ in os.walk(image_path):
-        for dirname in dirs:
-            try:
-                os.rmdir(os.path.join(root, dirname))
-                count += 1
-            except OSError:
-                pass
-    LOG.info("Removed %d empty directories", count)
|
|
||||||
|
|
||||||
|
|
||||||
def _get_highest_size_length(item_dict):
|
def _get_highest_size_length(item_dict):
|
||||||
@@ -393,15 +276,19 @@ def _get_highest_size_length(item_dict):
     return highest + highest / 3
 
 
-@asserdb
+@misc.asserdb
 def list_db(args):
     """List"""
+    if args.mode == 'mc':
+        LOG.setLevel(100) # supress logging
+
     obj = Iface(args.db, False, args.debug)
-    obj.list(path=args.path, recursive=args.recursive, long_=args.long)
+    obj.list(path=args.path, recursive=args.recursive, long_=args.long,
+             mode=args.mode)
     obj.close()
 
 
-@asserdb
+@misc.asserdb
 def update_db(args):
     """Update"""
     obj = Iface(args.db, args.pretend, args.debug)
@@ -409,36 +296,28 @@ def update_db(args):
     obj.close()
 
 
-@asserdb
+@misc.asserdb
 def add_dir(args):
     """Add"""
     obj = Iface(args.db, args.pretend, args.debug)
-    obj.add(args.dir_to_add)
+    obj.add(args.dir_to_add, args.label)
     obj.close()
 
 
-@asserdb
 def create_db(args):
     """List"""
     obj = Iface(args.db, args.pretend, args.debug)
-    obj.create(args.dir_to_add, args.imagedir)
+    obj.create(args.dir_to_add, args.label)
     obj.close()
 
 
-@asserdb
+@misc.asserdb
 def search(args):
     """Find"""
     obj = Iface(args.db, False, args.debug)
     obj.find(args.search_words)
     obj.close()
 
-
-@asserdb
-def cleanup(args):
-    """Cleanup"""
-    obj = Iface(args.db, False, args.debug)
-    obj.fsck()
-    obj.close()
-
 
 def main():
     """Main"""
@@ -448,6 +327,12 @@ def main():
     list_ = subparser.add_parser('list')
     list_.add_argument('db')
     list_.add_argument('path', nargs='?')
+    list_.add_argument('-m', '--mode', help='List items using mode. By '
+                       'default is simply plain mode, other possibility is to '
+                       'use "mc" mode, which is suitable to use with extfs '
+                       'plugin', default='plain')
+    list_.add_argument('-c', '--color', help='Use colors for listing',
+                       action='store_true', default=False)
     list_.add_argument('-l', '--long', help='Show size, date and type',
                        action='store_true', default=False)
     list_.add_argument('-r', '--recursive', help='list items in '
@@ -470,12 +355,8 @@ def main():
     create = subparser.add_parser('create')
     create.add_argument('db')
     create.add_argument('dir_to_add')
-    create.add_argument('-i', '--imagedir', help="Directory where to put "
-                        "images for the database. Popular, but deprecated "
-                        "choice is `~/.pygtktalog/images'. Currnet default "
-                        "is special string `:same_as_db:' which will try to "
-                        "create directory with the same name as the db with "
-                        "data suffix", default=':same_as_db:')
+    create.add_argument('-l', '--label', help='Add label as the root item of '
+                        'the added directory')
     create.add_argument('-p', '--pretend', help="Don't do the action, just "
                         "give the info what would gonna to happen.",
                         action='store_true', default=False)
@@ -491,6 +372,8 @@ def main():
                      action='store_true', default=False)
     add.add_argument('-d', '--debug', help='Turn on debug',
                      action='store_true', default=False)
+    add.add_argument('-l', '--label', help='Add label as the root item of the '
+                     'added directory')
     add.set_defaults(func=add_dir)
 
     find = subparser.add_parser('find')
@@ -500,17 +383,13 @@ def main():
                       action='store_true', default=False)
     find.set_defaults(func=search)
 
-    fsck = subparser.add_parser('fsck')
-    fsck.add_argument('db')
-    fsck.add_argument('-p', '--pretend', help="Don't do the action, just give"
-                      " the info what would gonna to happen.",
-                      action='store_true', default=False)
-    fsck.add_argument('-d', '--debug', help='Turn on debug',
-                      action='store_true', default=False)
-    fsck.set_defaults(func=cleanup)
-
     args = parser.parse_args()
-    args.func(args)
+    if 'func' in args:
+        args.func(args)
+    else:
+        parser.print_help()
 
 
 if __name__ == '__main__':
     main()
@@ -9,21 +9,16 @@ from sqlalchemy import MetaData, create_engine
 from sqlalchemy.orm import sessionmaker
 from sqlalchemy.ext.declarative import declarative_base
 
-from pygtktalog.logger import get_logger
+from pycatalog.logger import get_logger
 
 
-# setup SQLAlchemy logging facility
-# TODO: Logger("sqlalchemy")
-# or maybe it will be better to separate sqlalchemy stuff from application
-#get_logger("sqlalchemy", 'INFO')
-
 # Prepare SQLAlchemy objects
 Meta = MetaData()
 Base = declarative_base(metadata=Meta)
 Session = sessionmaker()
 DbFilename = None
 
-LOG = get_logger("dbcommon")
+LOG = get_logger()
 
 
 def connect(filename=None):
@@ -12,13 +12,11 @@ from sqlalchemy import Column, Table, Integer, Text
 from sqlalchemy import DateTime, ForeignKey, Sequence
 from sqlalchemy.orm import relation, backref
 
-from pygtktalog.dbcommon import Base
-from pygtktalog.thumbnail import ThumbCreator
-from pygtktalog.logger import get_logger
-from pygtktalog.misc import mk_paths
+from pycatalog.dbcommon import Base
+from pycatalog.logger import get_logger
 
 
-LOG = get_logger(__name__)
+LOG = get_logger()
 
 tags_files = Table("tags_files", Base.metadata,
                    Column("file_id", Integer, ForeignKey("files.id")),
@@ -49,8 +47,6 @@ class File(Base):
                         backref=backref('parent', remote_side="File.id"),
                         order_by=[type, filename])
     tags = relation("Tag", secondary=tags_files, order_by="Tag.tag")
-    thumbnail = relation("Thumbnail", backref="file")
-    images = relation("Image", backref="file", order_by="Image.filename")
 
     def __init__(self, filename=None, path=None, date=None, size=None,
                  ftype=None, src=None):
@@ -63,8 +59,7 @@ class File(Base):
         self.source = src
 
     def __repr__(self):
-        return "<File('%s', %s)>" % (self.filename.encode('utf-8'),
-                                     str(self.id))
+        return "<File('%s', %s)>" % (self.filename, str(self.id))
 
     def get_all_children(self):
         """
@@ -109,7 +104,7 @@ class Tag(Base):
     tag = Column(Text)
     group = relation('Group', backref=backref('tags', remote_side="Group.id"))
 
-    files = relation("File", secondary=tags_files)
+    files = relation("File", secondary=tags_files, back_populates="tags")
 
     def __init__(self, tag=None, group=None):
         self.tag = tag
@@ -119,112 +114,6 @@ class Tag(Base):
         return "<Tag('%s', %s)>" % (str(self.tag), str(self.id))
 
 
-class Thumbnail(Base):
-    """Thumbnail for the file"""
-    __tablename__ = "thumbnails"
-    id = Column(Integer, Sequence("thumbnail_id_seq"), primary_key=True)
-    file_id = Column(Integer, ForeignKey("files.id"), index=True)
-    filename = Column(Text)
-
-    def __init__(self, filename=None, img_path=None, file_obj=None):
-        self.filename = filename
-        self.file = file_obj
-        self.img_path = img_path
-        if filename and file_obj and img_path:
-            self.save(self.filename, img_path)
-
-    def save(self, fname, img_path):
-        """
-        Create file related thumbnail, add it to the file object.
-        """
-        new_name = mk_paths(fname, img_path)
-        ext = os.path.splitext(self.filename)[1]
-        if ext:
-            new_name.append("".join([new_name.pop(), ext]))
-
-        thumb = ThumbCreator(self.filename).generate()
-        name, ext = os.path.splitext(new_name.pop())
-        new_name.append("".join([name, "_t", ext]))
-        self.filename = os.path.sep.join(new_name)
-        if not os.path.exists(os.path.join(img_path, *new_name)):
-            shutil.move(thumb, os.path.join(img_path, *new_name))
-        else:
-            LOG.info("Thumbnail already exists (%s: %s)" % \
-                     (fname, "/".join(new_name)))
-            os.unlink(thumb)
-
-    def __repr__(self):
-        return "<Thumbnail('%s', %s)>" % (str(self.filename), str(self.id))
-
-
-class Image(Base):
-    """Images and their thumbnails"""
-    __tablename__ = "images"
-    id = Column(Integer, Sequence("images_id_seq"), primary_key=True)
-    file_id = Column(Integer, ForeignKey("files.id"), index=True)
-    filename = Column(Text)
-
-    def __init__(self, filename=None, img_path=None, file_obj=None, move=True):
-        self.filename = None
-        self.file = file_obj
-        self.img_path = img_path
-        if filename and img_path:
-            self.filename = filename
-            self.save(filename, img_path, move)
-
-    def save(self, fname, img_path, move=True):
-        """
-        Save and create coressponding thumbnail (note: it differs from file
-        related thumbnail!)
-        """
-        new_name = mk_paths(fname, img_path)
-        ext = os.path.splitext(self.filename)[1]
-
-        if ext:
-            new_name.append("".join([new_name.pop(), ext]))
-
-        if not os.path.exists(os.path.join(img_path, *new_name)):
-            if move:
-                shutil.move(self.filename, os.path.join(img_path, *new_name))
-            else:
-                shutil.copy(self.filename, os.path.join(img_path, *new_name))
-        else:
-            LOG.warning("Image with same CRC already exists "
-                        "('%s', '%s')" % (self.filename, "/".join(new_name)))
-
-        self.filename = os.path.sep.join(new_name)
-
-        name, ext = os.path.splitext(new_name.pop())
-        new_name.append("".join([name, "_t", ext]))
-
-        if not os.path.exists(os.path.join(img_path, *new_name)):
-            thumb = ThumbCreator(os.path.join(img_path, self.filename))
-            shutil.move(thumb.generate(), os.path.join(img_path, *new_name))
-        else:
-            LOG.info("Thumbnail already generated %s" % "/".join(new_name))
-
-    def get_copy(self):
-        """
-        Create the very same object as self with exception of id field
-        """
-        img = Image()
-        img.filename = self.filename
-        return img
-
-    @property
-    def thumbnail(self):
-        """
-        Return path to thumbnail for this image
-        """
-        path, fname = os.path.split(self.filename)
-        base, ext = os.path.splitext(fname)
-        return os.path.join(path, base + "_t" + ext)
-
-    def __repr__(self):
-        return "<Image('%s', %s)>" % (str(self.filename), str(self.id))
-
-
 class Exif(Base):
     """Selected EXIF information"""
     __tablename__ = "exif"
@@ -27,6 +27,7 @@ COLORS = {'WARNING': YELLOW,
           'CRITICAL': WHITE,
           'ERROR': RED}
 
+
 def cprint(txt, color):
     color_map = {"black": BLACK,
                  "red": RED,
@@ -36,7 +37,7 @@ def cprint(txt, color):
                  "magenta": MAGENTA,
                  "cyan": CYAN,
                  "white": WHITE}
-    print COLOR_SEQ % (30 + color_map[color]) + txt + RESET_SEQ
+    print(COLOR_SEQ % (30 + color_map[color]) + txt + RESET_SEQ)
 
 
 class DummyFormater(logging.Formatter):
@@ -58,36 +59,34 @@ class ColoredFormatter(logging.Formatter):
         record.levelname = levelname_color
         return logging.Formatter.format(self, record)
 
 
 log_obj = None
 
-#def get_logger(module_name, level='INFO', to_file=False):
-#def get_logger(module_name, level='DEBUG', to_file=True):
-def get_logger(module_name, level='INFO', to_file=True, to_console=True):
-# def get_logger(module_name, level='DEBUG', to_file=True, to_console=True):
-#def get_logger(module_name, level='DEBUG', to_file=False):
+def get_logger(level='INFO', to_file=True, to_console=True):
     """
     Prepare and return log object. Standard formatting is used for all logs.
     Arguments:
-        @module_name - String name for Logger object.
         @level - Log level (as string), one of DEBUG, INFO, WARN, ERROR and
                  CRITICAL.
         @to_file - If True, additionally stores full log in file inside
-                   .pygtktalog config directory and to stderr, otherwise log
+                   .pycatalog config directory and to stderr, otherwise log
                    is only redirected to stderr.
     Returns: object of logging.Logger class
     """
-    path = os.path.join(os.path.expanduser("~"), ".pygtktalog", "app.log")
+    global log_obj
+    if log_obj:
+        return log_obj
+
+    path = os.path.join(os.path.expanduser("~"), ".pycatalog", "app.log")
 
-    log = logging.getLogger(module_name)
+    log = logging.getLogger("pycatalog")
     log.setLevel(LEVEL[level])
 
     if to_console:
-        #path = "/dev/null"
 
         console_handler = logging.StreamHandler(sys.stderr)
         console_formatter = ColoredFormatter("%(filename)s:%(lineno)s - "
                                              "%(levelname)s - %(message)s")
         console_handler.setFormatter(console_formatter)
 
         log.addHandler(console_handler)
@@ -107,4 +106,5 @@ def get_logger(module_name, level='INFO', to_file=True, to_console=True):
         dummy_handler.setFormatter(dummy_formatter)
         log.addHandler(dummy_handler)
 
+    log_obj = log
     return log

pycatalog/misc.py (new file, 56 lines)
@@ -0,0 +1,56 @@
+"""
+Project: pyGTKtalog
+Description: Misc functions used more than once in src
+Type: lib
+Author: Roman 'gryf' Dobosz, gryf73@gmail.com
+Created: 2009-04-05
+"""
+import os
+import sys
+
+from pycatalog import logger
+
+BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE = range(30, 38)
+
+RESET_SEQ = '\033[0m'
+COLOR_SEQ = '\033[1;%dm'
+BOLD_SEQ = '\033[1m'
+
+LOG = logger.get_logger()
+
+
+def colorize(txt, color):
+    """Pretty print with colors to console."""
+    color_map = {'black': BLACK,
+                 'red': RED,
+                 'green': GREEN,
+                 'yellow': YELLOW,
+                 'blue': BLUE,
+                 'magenta': MAGENTA,
+                 'cyan': CYAN,
+                 'white': WHITE}
+    return COLOR_SEQ % color_map[color] + txt + RESET_SEQ
+
+
+def float_to_string(float_length):
+    """
+    Parse float digit into time string
+    Arguments:
+        @number - digit to be converted into time.
+    Returns HH:MM:SS formatted string
+    """
+    hour = int(float_length / 3600)
+    float_length -= hour*3600
+    minutes = int(float_length / 60)
+    float_length -= minutes * 60
+    sec = int(float_length)
+    return f"{hour:02}:{minutes:02}:{sec:02}"
+
+
+def asserdb(func):
+    def wrapper(args):
+        if not os.path.exists(args.db):
+            print(colorize("File `%s' does not exists!" % args.db, 'red'))
+            sys.exit(1)
+        func(args)
+    return wrapper
@@ -6,19 +6,20 @@
 Created: 2011-03-27
 """
 import os
-import sys
 import re
 from datetime import datetime
 import mimetypes
 
-import pygtktalog.misc
-from pygtktalog.dbobjects import File, Image, Thumbnail, Config, TYPE
-from pygtktalog.dbcommon import Session
-from pygtktalog.logger import get_logger
-from pygtktalog.video import Video
+import exifread
+import mutagen
+
+from pycatalog.dbobjects import File, TYPE
+from pycatalog import dbcommon
+from pycatalog.logger import get_logger
+from pycatalog.video import Video
 
 
-LOG = get_logger(__name__)
+LOG = get_logger()
 RE_FN_START = re.compile(r'(?P<fname_start>'
                          r'(\[[^\]]*\]\s)?'
                          r'([^(]*)\s'
@@ -26,10 +27,8 @@ RE_FN_START = re.compile(r'(?P<fname_start>'
                          r'(\[[A-Fa-f0-9]{8}\])\..*')
 
 
-
 class NoAccessError(Exception):
     """No access exception"""
-    pass
 
 
 class Scan(object):
@@ -47,13 +46,11 @@ class Scan(object):
         self._files = []
         self._existing_files = [] # for re-use purpose in adding
         self._existing_branch = [] # for branch storage, mainly for updating
-        self._session = Session()
+        self._session = dbcommon.Session()
         self.files_count = self._get_files_count()
         self.current_count = 0
 
-        self._set_image_path()
-
-    def add_files(self, engine=None):
+    def add_files(self, label=None):
         """
         Returns list, which contain object, modification date and file
         size.
@@ -77,6 +74,8 @@ class Scan(object):
 
         # add only first item from _files, because it is a root of the other,
         # so other will be automatically added aswell.
+        if label:
+            self._files[0].filename = label
         self._session.add(self._files[0])
         self._session.commit()
         return self._files
@@ -114,7 +113,7 @@ class Scan(object):
         # number of objects to retrieve at once. Limit is 999. Let's do a
         # little bit below.
         num = 900
-        steps = len(all_ids) / num + 1
+        steps = len(all_ids) // num + 1
         for step in range(steps):
             all_obj.extend(self._session
                            .query(File)
@@ -155,7 +154,7 @@ class Scan(object):
         # in case of such, better get me a byte string. It is not perfect
         # though, since it WILL crash if the update_path would contain some
         # unconvertable characters.
-        update_path = update_path.encode("utf-8")
+        update_path = update_path
 
         # refresh objects
         LOG.debug("Refreshing objects")
@@ -181,8 +180,8 @@ class Scan(object):
         # self._session.merge(self._files[0])
         LOG.debug("Deleting objects whitout parent: %s",
                   str(self._session.query(File)
-                      .filter(File.parent==None).all()))
-        self._session.query(File).filter(File.parent==None).delete()
+                      .filter(File.parent.is_(None)).all()))
+        self._session.query(File).filter(File.parent.is_(None)).delete()
 
         self._session.commit()
         return self._files
@@ -199,8 +198,7 @@ class Scan(object):
                    '.ogm': 'video',
                    '.ogv': 'video'}
 
-        fp = os.path.join(fobj.filepath.encode(sys.getfilesystemencoding()),
-                          fobj.filename.encode(sys.getfilesystemencoding()))
+        fp = os.path.join(fobj.filepath, fobj.filename)
 
         mimeinfo = mimetypes.guess_type(fp)
         if mimeinfo[0]:
@@ -208,7 +206,7 @@ class Scan(object):
 
         ext = os.path.splitext(fp)[1]
 
-        if mimeinfo and mimeinfo in mimedict.keys():
+        if mimeinfo and mimeinfo in mimedict:
             mimedict[mimeinfo](fobj, fp)
         elif ext and ext in extdict:
             mimedict[extdict[ext]](fobj, fp)
@@ -217,66 +215,35 @@ class Scan(object):
             pass
 
     def _audio(self, fobj, filepath):
-        # LOG.warning('audio')
-        return
+        tags = mutagen.File(filepath)
+        if not tags:
+            return
+        fobj.description = tags.pprint()
 
     def _image(self, fobj, filepath):
-        # LOG.warning('image')
-        return
+        """Read exif if exists, add it to description"""
+        with open(filepath, 'rb') as obj:
+            exif = exifread.process_file(obj)
+        if not exif:
+            return
+
+        data = []
+        # longest key + 2, since we need a colon and a space after it
+        longest_key = max([len(k) for k in exif]) + 2
+        for key in exif:
+            if 'thumbnail' in key.lower() and isinstance(exif[key], bytes):
+                data.append(f"{key + ':' :<{longest_key}}thumbnail present")
+                continue
+            data.append(f"{key + ':' :<{longest_key}}{exif[key]}")
+        fobj.description = "\n".join(data)
 
     def _video(self, fobj, filepath):
         """
         Make captures for a movie. Save it under uniq name.
         """
-        result = RE_FN_START.match(fobj.filename)
-        if result:
-            self._check_related(fobj, result.groupdict()['fname_start'])
-
         vid = Video(filepath)
 
         fobj.description = vid.get_formatted_tags()
 
-        preview_fn = vid.capture()
-        if preview_fn:
-            Image(preview_fn, self.img_path, fobj)
-
-    def _check_related(self, fobj, filename_start):
-        """
-        Try to search for related files which belongs to specified File
-        object and pattern. If found, additional File objects are created.
-
-        For example, if we have movie file named like:
-            [aXXo] Batman (1989) [D3ADBEEF].avi
-            [aXXo] Batman (1989) trailer [B00B1337].avi
-            Batman (1989) [D3ADBEEF].avi
-            Batman [D3ADBEEF].avi
-
-        And for example file '[aXXo] Batman (1989) [D3ADBEEF].avi' might have
-        some other accompanied files, like:
-
-            [aXXo] Batman (1989) [D3ADBEEF].avi.conf
-            [aXXo] Batman (1989) [DEADC0DE].nfo
-            [aXXo] Batman (1989) cover [BEEFD00D].jpg
-            [aXXo] Batman (1989) poster [FEEDD00D].jpg
-
-        Which can be atuomatically asociated with the movie.
-
-        This method find such files, and for some of them (currently images)
-        will perform extra actions - like creating corresponding Image objects.
-
-        """
-        for fname in os.listdir(fobj.filepath):
-            extension = os.path.splitext(fname)[1]
-            if fname.startswith(filename_start) and \
-                    extension in ('.jpg', '.gif', '.png'):
-                full_fname = os.path.join(fobj.filepath, fname)
-                LOG.debug('found corresponding image file: %s', full_fname)
-
-                Image(full_fname, self.img_path, fobj, False)
-
-                if not fobj.thumbnail:
-                    Thumbnail(full_fname, self.img_path, fobj)
-
     def _get_all_files(self):
         """Gather all File objects"""
         self._existing_files = self._session.query(File).all()
@@ -287,13 +254,8 @@ class Scan(object):
         """
         fullpath = os.path.join(path, fname)
 
-        fname = fname.decode(sys.getfilesystemencoding(),
-                             errors="replace")
-        path = path.decode(sys.getfilesystemencoding(),
-                           errors="replace")
-
         if ftype == TYPE['link']:
-            fname = fname + " -> " + os.readlink(fullpath).decode('utf-8')
+            fname = fname + " -> " + os.readlink(fullpath)
 
         fob = {'filename': fname,
                'path': path,
@@ -378,7 +340,7 @@ class Scan(object):
         LOG.info("Scanning `%s' [%s/%s]", fullpath, self.current_count,
                  self.files_count)
 
-        root, dirs, files = os.walk(fullpath).next()
+        root, dirs, files = next(os.walk(fullpath))
         for fname in files:
             fpath = os.path.join(root, fname)
             extension = os.path.splitext(fname)[1]
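The `.next()` call replaced above is Python 2-only iterator syntax; Python 3 renamed the method to `__next__`, and the portable spelling is the `next()` builtin. A minimal reproduction:

```python
import os
import tempfile

# os.walk() returns a generator; in Python 3 generators have no .next()
# method, so the first (root, dirs, files) triple must be pulled with
# the next() builtin.
tmp = tempfile.mkdtemp()
open(os.path.join(tmp, "example.txt"), "w").close()

root, dirs, files = next(os.walk(tmp))
```

The same spelling also works in Python 2.6+, which is why it is the usual choice during a 2-to-3 migration.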
@@ -423,7 +385,7 @@ class Scan(object):
         for dirname in dirs:
             dirpath = os.path.join(root, dirname)
 
-            if not os.access(dirpath, os.R_OK|os.X_OK):
+            if not os.access(dirpath, os.R_OK | os.X_OK):
                 LOG.info("Cannot access directory %s", dirpath)
                 continue
 
@@ -479,17 +441,6 @@ class Scan(object):
         LOG.debug("count of files: %s", count)
         return count
 
-    def _set_image_path(self):
-        """Get or calculate the images path"""
-        image_path = self._session.query(Config) \
-            .filter(Config.key=="image_path").one()
-        if image_path.value == ":same_as_db:":
-            image_path = pygtktalog.misc.calculate_image_path()
-        else:
-            image_path = pygtktalog.misc.calculate_image_path(image_path.value)
-
-        self.img_path = image_path
-
 
 def _get_dirsize(path):
     """
@@ -507,4 +458,3 @@ def _get_dirsize(path):
                                     os.path.join(root, fname))
     LOG.debug("_get_dirsize, %s: %d", path, size)
     return size
-
@@ -6,17 +6,18 @@
 Author: Roman 'gryf' Dobosz, gryf73@gmail.com
 Created: 2009-04-04
 """
+import math
 import os
 import shutil
-from tempfile import mkdtemp, mkstemp
-import math
+import tempfile
 
 from PIL import Image
-from pygtktalog.misc import float_to_string
-from pygtktalog.logger import get_logger
+from pycatalog.misc import float_to_string
+from pycatalog.logger import get_logger
 
 
-LOG = get_logger("Video")
+LOG = get_logger()
 
 
 class Video(object):
@@ -49,7 +50,7 @@ class Video(object):
                   'ID_AUDIO_CODEC': ['audio_codec', self._return_lower],
                   'ID_AUDIO_FORMAT': ['audio_format', self._return_lower],
                   'ID_AUDIO_NCH': ['audio_no_channels', int]}
         # TODO: what about audio/subtitle language/existence?
 
         for key in output:
             if key in attrs:
@@ -58,9 +59,9 @@ class Video(object):
         if 'length' in self.tags and self.tags['length'] > 0:
             start = self.tags.get('start', 0)
             length = self.tags['length'] - start
-            hours = length / 3600
+            hours = length // 3600
             seconds = length - hours * 3600
-            minutes = seconds / 60
+            minutes = seconds // 60
             seconds -= minutes * 60
             length_str = "%02d:%02d:%02d" % (hours, minutes, seconds)
             self.tags['duration'] = length_str
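The `//` changes above matter because Python 3's `/` always yields a float, which would break the integer duration arithmetic. The hunk's computation, extracted as a standalone function for illustration (hypothetical `to_duration` name):

```python
def to_duration(length, start=0):
    """Format a length in seconds as HH:MM:SS, as the hunk above does."""
    length = length - start
    hours = length // 3600       # plain '/' would give a float in Python 3
    seconds = length - hours * 3600
    minutes = seconds // 60
    seconds -= minutes * 60
    return "%02d:%02d:%02d" % (hours, minutes, seconds)
```

With true division, `hours` and `minutes` would be floats and `%02d` formatting would silently truncate intermediate values the arithmetic depends on; floor division keeps every step integral.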
@@ -92,16 +93,16 @@ class Video(object):
         if scale < 1:
             return None
 
-        no_pictures = self.tags['length'] / scale
+        no_pictures = self.tags['length'] // scale
 
         if no_pictures > 8:
-            no_pictures = (no_pictures / 8) * 8  # only multiple of 8, please.
+            no_pictures = (no_pictures // 8) * 8  # only multiple of 8, please.
         else:
             # for really short movies
             no_pictures = 4
 
-        tempdir = mkdtemp()
-        file_desc, image_fn = mkstemp(suffix=".jpg")
+        tempdir = tempfile.mkdtemp()
+        file_desc, image_fn = tempfile.mkstemp(suffix=".jpg")
         os.close(file_desc)
         self._make_captures(tempdir, no_pictures)
         self._make_montage(tempdir, image_fn, no_pictures)
@@ -113,16 +114,16 @@ class Video(object):
         """
         Return formatted tags as a string
         """
-        out_tags = u''
+        out_tags = ''
         if 'container' in self.tags:
-            out_tags += u"Container: %s\n" % self.tags['container']
+            out_tags += "Container: %s\n" % self.tags['container']
 
         if 'width' in self.tags and 'height' in self.tags:
-            out_tags += u"Resolution: %sx%s\n" % (self.tags['width'],
+            out_tags += "Resolution: %sx%s\n" % (self.tags['width'],
                                                  self.tags['height'])
 
         if 'duration' in self.tags:
-            out_tags += u"Duration: %s\n" % self.tags['duration']
+            out_tags += "Duration: %s\n" % self.tags['duration']
 
         if 'video_codec' in self.tags:
             out_tags += "Video codec: %s\n" % self.tags['video_codec']
@@ -178,20 +179,21 @@ class Video(object):
         @directory - full output directory name
         @no_pictures - number of pictures to take
         """
-        step = float(self.tags['length'] / (no_pictures + 1))
+        step = self.tags['length'] / (no_pictures + 1)
         current_time = 0
         for dummy in range(1, no_pictures + 1):
             current_time += step
             time = float_to_string(current_time)
-            cmd = "mplayer \"%s\" -ao null -brightness 0 -hue 0 " \
-                  "-saturation 0 -contrast 0 -mc 0 -vf-clr -vo jpeg:outdir=\"%s\" -ss %s" \
-                  " -frames 1 2>/dev/null"
+            cmd = ('mplayer "%s" -ao null -brightness 0 -hue 0 '
+                   '-saturation 0 -contrast 0 -mc 0 -vf-clr '
+                   '-vo jpeg:outdir="%s" -ss %s -frames 1 2>/dev/null')
             os.popen(cmd % (self.filename, directory, time)).readlines()
 
             try:
                 shutil.move(os.path.join(directory, "00000001.jpg"),
                             os.path.join(directory, "picture_%s.jpg" % time))
-            except IOError, (errno, strerror):
+            except IOError as exc:
+                errno, strerror = exc.args
                 LOG.error('error capturing file from movie "%s" at position '
                           '%s. Errors: %s, %s', self.filename, time, errno,
                           strerror)
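The `except IOError, (errno, strerror):` form removed above is Python 2-only syntax (it unpacked the exception tuple directly in the `except` clause). Python 3 requires binding with `as` and reading the values from `exc.args`, as the hunk now does:

```python
# Python 2 wrote:   except IOError, (errno, strerror):
# Python 3 binds the exception and unpacks .args explicitly.
try:
    open("/nonexistent/path/for/demo")
except IOError as exc:
    err_no, strerror = exc.args
```

For `IOError`/`OSError` raised by `open()`, `args` is the `(errno, strerror)` pair, so the unpacking is a drop-in replacement for the old syntax.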
@@ -206,7 +208,7 @@ class Video(object):
         @no_pictures - number of pictures
         timeit result:
         python /usr/lib/python2.6/timeit.py -n 1 -r 1 'from \
-        pygtktalog.video import Video; v = Video("/home/gryf/t/a.avi"); \
+        pycatalog.video import Video; v = Video("/home/gryf/t/a.avi"); \
         v.capture()'
         1 loops, best of 1: 18.8 sec per loop
         """
@@ -216,13 +218,13 @@ class Video(object):
 
         if not (self.tags['width'] * row_length) > self.out_width:
             for i in [8, 6, 5]:
-                if (no_pictures % i) == 0 and \
-                        (i * self.tags['width']) <= self.out_width:
+                if ((no_pictures % i) == 0 and
+                        (i * self.tags['width']) <= self.out_width):
                     row_length = i
                     break
 
-        coef = float(self.out_width - row_length - 1) / \
-            (self.tags['width'] * row_length)
+        coef = (float(self.out_width - row_length - 1) /
+                (self.tags['width'] * row_length))
         if coef < 1:
             dim = (int(self.tags['width'] * coef),
                    int(self.tags['height'] * coef))
@@ -231,10 +233,10 @@ class Video(object):
 
         ifn_list = os.listdir(directory)
         ifn_list.sort()
-        img_list = [Image.open(os.path.join(directory, fn)).resize(dim) \
+        img_list = [Image.open(os.path.join(directory, fn)).resize(dim)
                     for fn in ifn_list]
 
-        rows = no_pictures / row_length
+        rows = no_pictures // row_length
         cols = row_length
         isize = (cols * dim[0] + cols + 1,
                  rows * dim[1] + rows + 1)
@@ -250,7 +252,7 @@ class Video(object):
             bbox = (left, upper, right, lower)
             try:
                 img = img_list.pop(0)
-            except:
+            except Exception:
                 break
             inew.paste(img, bbox)
         inew.save(image_fn, 'JPEG')
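Replacing the bare `except:` above with `except Exception:` keeps `KeyboardInterrupt` and `SystemExit` propagating while still swallowing ordinary errors like the `IndexError` from popping an empty list. The pattern in isolation (hypothetical `pop_or_none` helper):

```python
def pop_or_none(items):
    # A bare 'except:' would also trap SystemExit and KeyboardInterrupt;
    # 'except Exception:' only catches ordinary errors such as IndexError.
    try:
        return items.pop(0)
    except Exception:
        return None
```

This is why linters flag bare `except:`: a Ctrl-C during the loop would otherwise be silently converted into a `break`.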
@@ -271,7 +273,7 @@ class Video(object):
         """
         try:
             return int(chain.split(".")[0])
-        except:
+        except Exception:
             return 0
 
     def __str__(self):
@@ -1,58 +0,0 @@
-"""
-    Project: pyGTKtalog
-    Description: Initialization for main module - i18n and so.
-    Type: core
-    Author: Roman 'gryf' Dobosz, gryf73@gmail.com
-    Created: 2009-05-05
-"""
-
-__version__ = "2.0.0"
-__appname__ = "pyGTKtalog"
-__copyright__ = u"\u00A9 Roman 'gryf' Dobosz"
-__summary__ = "%s is simple tool for managing file collections." % __appname__
-__web__ = "http://github.com/gryf/pygtktalog"
-
-import os
-import sys
-import locale
-import gettext
-import __builtin__
-
-from pygtktalog.logger import get_logger
-
-
-__all__ = ['dbcommon',
-           'dbobjects',
-           'dialogs',
-           'logger',
-           'misc']
-
-GETTEXT_DOMAIN = 'pygtktalog'
-# There should be message catalogs in "locale" directory placed by setup.py
-# script. If there is no such directory, let's assume that message catalogs are
-# placed in system wide location such as /usr/share/locale by Linux
-# distribution package maintainer.
-LOCALE_PATH = os.path.join(os.path.abspath(os.path.dirname(__file__)),
-                           'locale')
-
-try:
-    locale.setlocale(locale.LC_ALL, '')
-except locale.Error:
-    # unknown locale string, fallback to C
-    locale.setlocale(locale.LC_ALL, 'C')
-
-# for module in gtk.glade, gettext:
-#     if os.path.exists(LOCALE_PATH):
-#         module.bindtextdomain(GETTEXT_DOMAIN, LOCALE_PATH)
-#     else:
-#         module.bindtextdomain(GETTEXT_DOMAIN)
-#     module.textdomain(GETTEXT_DOMAIN)
-
-# register the gettext function for the whole interpreter as "_"
-__builtin__._ = gettext.gettext
-
-# wrap errors into usefull message
-#def log_exception(exc_type, exc_val, traceback):
-#    get_logger(__name__).error(exc_val)
-#
-#sys.excepthook = log_exception
@@ -1,248 +0,0 @@
-# -*- coding: utf-8 -*-
-
-import gtk
-
-from pygtktalog import logger
-
-UI = """
-<ui>
-
-  <menubar name="MenuBar">
-    <menu action="File">
-      <menuitem action="New"/>
-      <menuitem action="Open"/>
-      <menuitem action="Save"/>
-      <menuitem action="Save As"/>
-      <separator/>
-      <menuitem action="Import"/>
-      <menuitem action="Export"/>
-      <separator/>
-      <menuitem action="Recent"/>
-      <separator/>
-      <menuitem action="Quit"/>
-    </menu>
-    <menu action="Edit">
-      <menuitem action="Delete"/>
-      <separator/>
-      <menuitem action="Find"/>
-      <separator/>
-      <menuitem action="Preferences"/>
-    </menu>
-    <menu action="Catalog">
-      <menuitem action="Add_CD"/>
-      <menuitem action="Add_Dir"/>
-      <separator/>
-      <menuitem action="Delete_all_images"/>
-      <menuitem action="Delete_all_thumbnails"/>
-      <menuitem action="Save_all_images"/>
-      <separator/>
-      <menuitem action="Catalog_statistics"/>
-      <separator/>
-      <menuitem action="Cancel"/>
-    </menu>
-    <menu action="View">
-      <menuitem action="Toolbar"/>
-      <menuitem action="Statusbar"/>
-    </menu>
-    <menu action="Help">
-      <menuitem action="About"/>
-    </menu>
-  </menubar>
-
-  <toolbar name="ToolBar">
-    <toolitem action="New"/>
-    <toolitem action="Open"/>
-    <toolitem action="Save"/>
-    <separator/>
-    <toolitem action="Add_CD"/>
-    <toolitem action="Add_Dir"/>
-    <toolitem action="Find"/>
-    <separator/>
-    <toolitem action="Cancel"/>
-    <toolitem action="Quit"/>
-    <toolitem action="Debug"/>
-  </toolbar>
-
-</ui>
-"""
-LOG = logger.get_logger(__name__)
-LOG.setLevel(2)
-
-
-class ConnectedWidgets(object):
-    """grouped widgets"""
-    def __init__(self, toolbar, menu):
-        super(ConnectedWidgets, self).__init__()
-        self.toolbar = toolbar
-        self.menu = menu
-
-    def hide(self):
-        self.toolbar.hide()
-        self.menu.hide()
-
-    def show(self):
-        self.toolbar.show()
-        self.menu.show()
-
-    def set_sensitive(self, state):
-        self.toolbar.set_sensitive(state)
-        self.menu.set_sensitive(state)
-
-
-class MainWindow(object):
-
-    def __init__(self, debug=False):
-        """Initialize window"""
-        LOG.debug("initialize")
-        self.window = gtk.Window()
-        self.window.set_default_size(650, -1)
-        self.window.set_title("pygtktalog")
-        self.window.connect("delete-event", self.on_quit)
-
-        self.recent = None
-        self.toolbar = None
-        self.statusbar = None
-        self.cancel = None
-        self.debug = None
-
-        vbox = gtk.VBox(False, 0)
-
-        self._setup_menu_toolbar(vbox)
-
-        # TODO:
-        # 1. toolbar with selected tags
-        # 2. main view (splitter)
-        # 3. treeview with tag cloud (left split)
-        # 4. splitter (right split)
-        # 5. file list (upper split)
-        # 6. details w images and thumb (lower split)
-        # 7. status bar (if needed…)
-
-        hbox = gtk.HBox(False, 0)
-        vbox.add(hbox)
-
-        self.window.add(vbox)
-        self.window.show_all()
-        self.debug.hide()
-
-    def fake_recent(self):
-        recent_menu = gtk.Menu()
-        for i in "one two techno foo bar baz".split():
-            item = gtk.MenuItem(i)
-            item.connect_object("activate", self.on_recent,
-                                "/some/fake/path/" + i)
-            recent_menu.append(item)
-            item.show()
-        self.recent.set_submenu(recent_menu)
-
-    def _setup_menu_toolbar(self, vbox):
-        """Create menu/toolbar using uimanager."""
-        actions = [('File', None, '_File'),
-                   ('New', gtk.STOCK_NEW, '_New', None, 'Create new catalog', self.on_new),
-                   ('Open', gtk.STOCK_OPEN, '_Open', None, 'Open catalog file', self.on_open),
-                   ('Save', gtk.STOCK_SAVE, '_Save', None, 'Save catalog file', self.on_save),
-                   ('Save As', gtk.STOCK_SAVE_AS, '_Save As', None, None, self.on_save),
-                   ('Import', None, '_Import', None, None, self.on_import),
-                   ('Export', None, '_Export', None, None, self.on_export),
-                   ('Recent', None, '_Recent files'),
-                   ('Quit', gtk.STOCK_QUIT, '_Quit', None, 'Quit the Program', self.on_quit),
-                   ('Edit', None, '_Edit'),
-                   ('Delete', gtk.STOCK_DELETE, '_Delete', None, None, self.on_delete),
-                   ('Find', gtk.STOCK_FIND, '_Find', None, 'Find file', self.on_find),
-                   ('Preferences', gtk.STOCK_PREFERENCES, '_Preferences'),
-                   ('Catalog', None, '_Catalog'),
-                   ('Add_CD', gtk.STOCK_CDROM, '_Add CD', None, 'Add CD/DVD/BR to catalog'),
-                   ('Add_Dir', gtk.STOCK_DIRECTORY, '_Add Dir', None, 'Add directory to catalog'),
-                   ('Delete_all_images', None, '_Delete all images'),
-                   ('Delete_all_thumbnails', None, '_Delete all thumbnails'),
-                   ('Save_all_images', None, '_Save all images…'),
-                   ('Catalog_statistics', None, '_Catalog statistics'),
-                   ('Cancel', gtk.STOCK_CANCEL, '_Cancel'),
-                   ('View', None, '_View'),
-                   ('Help', None, '_Help'),
-                   ('About', gtk.STOCK_ABOUT, '_About'),
-                   ('Debug', gtk.STOCK_DIALOG_INFO, 'Debug')]
-
-        toggles = [('Toolbar', None, '_Toolbar'),
-                   ('Statusbar', None, '_Statusbar')]
-
-        mgr = gtk.UIManager()
-        accelgrp = mgr.get_accel_group()
-        self.window.add_accel_group(accelgrp)
-
-        agrp = gtk.ActionGroup("Actions")
-        agrp.add_actions(actions)
-        agrp.add_toggle_actions(toggles)
-
-        mgr.insert_action_group(agrp, 0)
-        mgr.add_ui_from_string(UI)
-
-        help_widget = mgr.get_widget("/MenuBar/Help")
-        help_widget.set_right_justified(True)
-
-        self.recent = mgr.get_widget("/MenuBar/File/Recent")
-        self.fake_recent()
-
-        menubar = mgr.get_widget("/MenuBar")
-        vbox.pack_start(menubar)
-        self.toolbar = mgr.get_widget("/ToolBar")
-        vbox.pack_start(self.toolbar)
-
-        menu_cancel = mgr.get_widget('/MenuBar/Catalog/Cancel')
-        toolbar_cancel = mgr.get_widget('/ToolBar/Cancel')
-        self.cancel = ConnectedWidgets(toolbar_cancel, menu_cancel)
-        self.cancel.set_sensitive(False)
-
-        self.debug = mgr.get_widget('/ToolBar/Debug')
-
-        self.toolbar = mgr.get_widget('/MenuBar/View/Toolbar')
-        self.statusbar = mgr.get_widget('/MenuBar/View/Statusbar')
-
-    def on_new(self, *args, **kwargs):
-        LOG.debug("On new")
-        return
-
-    def on_open(self, *args, **kwargs):
-        LOG.debug("On open")
-        return
-
-    def on_save(self, *args, **kwargs):
-        LOG.debug("On save")
-        return
-
-    def on_save_as(self, *args, **kwargs):
-        LOG.debug("On save as")
-        return
-
-    def on_import(self, *args, **kwargs):
-        LOG.debug("On import")
-        return
-
-    def on_export(self, *args, **kwargs):
-        LOG.debug("On export")
-        return
-
-    def on_recent(self, *args, **kwargs):
-        LOG.debug("On recent")
-        print args, kwargs
-
-    def on_quit(self, *args, **kwargs):
-        LOG.debug("on quit")
-        gtk.main_quit()
-
-    def on_delete(self, *args, **kwargs):
-        LOG.debug("On delete")
-        return
-
-    def on_find(self, *args, **kwargs):
-        LOG.debug("On find")
-        return
-
-    def on_about(self, event, menuitem):
-        LOG.debug("about", event, menuitem)
-        return
-
-
-def run():
-    gui = MainWindow()
-    gtk.mainloop()
@@ -1,75 +0,0 @@
-"""
-    Project: pyGTKtalog
-    Description: Misc functions used more than once in src
-    Type: lib
-    Author: Roman 'gryf' Dobosz, gryf73@gmail.com
-    Created: 2009-04-05
-"""
-import os
-import errno
-from zlib import crc32
-
-import pygtktalog.dbcommon
-from pygtktalog.logger import get_logger
-
-LOG = get_logger(__name__)
-
-
-def float_to_string(float_length):
-    """
-    Parse float digit into time string
-    Arguments:
-        @number - digit to be converted into time.
-    Returns HH:MM:SS formatted string
-    """
-    hour = int(float_length / 3600)
-    float_length -= hour*3600
-    minutes = int(float_length / 60)
-    float_length -= minutes * 60
-    sec = int(float_length)
-    return "%02d:%02d:%02d" % (hour, minutes, sec)
-
-def calculate_image_path(dbpath=None, create=False):
-    """Calculate image path out of provided path or using current connection"""
-    if not dbpath:
-        dbpath = pygtktalog.dbcommon.DbFilename
-        if dbpath == ":memory:":
-            raise OSError("Cannot create image path out of in-memory db!")
-
-        dir_, file_ = (os.path.dirname(dbpath), os.path.basename(dbpath))
-        file_base, dummy = os.path.splitext(file_)
-        images_dir = os.path.join(dir_, file_base + "_images")
-    else:
-        if dbpath and "~" in dbpath:
-            dbpath = os.path.expanduser(dbpath)
-        if dbpath and "$" in dbpath:
-            dbpath = os.path.expandvars(dbpath)
-        images_dir = dbpath
-
-    if create:
-        if not os.path.exists(images_dir):
-            try:
-                os.mkdir(images_dir)
-            except OSError, err:
-                if err.errno != errno.EEXIST:
-                    raise
-    elif not os.path.exists(images_dir):
-        raise OSError("%s: No such directory" % images_dir)
-
-    return os.path.abspath(images_dir)
-
-def mk_paths(fname, img_path):
-    """Make path for provided pathname by calculating crc32 out of file"""
-    with open(fname) as fobj:
-        new_path = "%x" % (crc32(fobj.read(10*1024*1024)) & 0xffffffff)
-
-    new_path = [new_path[i:i + 2] for i in range(0, len(new_path), 2)]
-    full_path = os.path.join(img_path, *new_path[:-1])
-
-    try:
-        os.makedirs(full_path)
-    except OSError as exc:
-        if exc.errno != errno.EEXIST:
-            LOG.debug("Directory %s already exists." % full_path)
-
-    return new_path
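The removed `mk_paths` spreads stored images across a shallow directory tree derived from a CRC32 of the file's first 10 MB, splitting the hex digest into two-character path segments. The bucketing step alone, using only `zlib` (standalone sketch with a hypothetical `crc_buckets` name, not the original function):

```python
import zlib


def crc_buckets(data):
    """Split the hex CRC32 of *data* into two-character path segments,
    as the removed mk_paths helper did; all but the last segment
    become nested directory names."""
    digest = "%x" % (zlib.crc32(data) & 0xffffffff)
    return [digest[i:i + 2] for i in range(0, len(digest), 2)]
```

Hashing into fixed-width buckets like this keeps any single directory from accumulating thousands of image files, which some filesystems handle poorly.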
@@ -1,25 +0,0 @@
-"""
-    Project: pyGTKtalog
-    Description: pyGTK common utility functions
-    Type: tility
-    Author: Roman 'gryf' Dobosz, gryf73@gmail.com
-    Created: 2010-11-07 13:30:37
-"""
-
-def get_tv_item_under_cursor(treeview):
-    """
-    Get item (most probably id of the row) form tree view under cursor.
-    Arguments:
-        @treeview - gtk.TreeView
-    Returns:
-        Item in first column of TreeModel, which TreeView is connected with,
-        None in other cases
-    """
-    path, column = treeview.get_cursor()
-    if path and column:
-        model = treeview.get_model()
-        tm_iter = model.get_iter(path)
-        item_id = model.get_value(tm_iter, 0)
-        return item_id
-    return None
-
@@ -1,114 +0,0 @@
-"""
-    Project: pyGTKtalog
-    Description: Create thumbnail for sepcified image
-    Type: lib
-    Author: Roman 'gryf' Dobosz, gryf73@gmail.com
-    Created: 2011-05-15
-"""
-
-import os
-from tempfile import mkstemp
-import shutil
-
-from PIL import Image
-import exifread
-
-from pygtktalog.logger import get_logger
-
-
-LOG = get_logger(__name__)
-
-
-class ThumbCreator(object):
-    """
-    Class for generate/extract thumbnail from image file
-    """
-
-    def __init__(self, filename):
-        self.thumb_x = 160
-        self.thumb_y = 160
-        self.filename = filename
-
-    def generate(self):
-        """
-        Save thumbnail into temporary file
-        """
-        exif = {}
-        orientations = {2: Image.FLIP_LEFT_RIGHT,  # Mirrored horizontal
-                        3: Image.ROTATE_180,       # Rotated 180
-                        4: Image.FLIP_TOP_BOTTOM,  # Mirrored vertical
-                        5: Image.ROTATE_90,        # Mirrored horizontal then
-                                                   # rotated 90 CCW
-                        6: Image.ROTATE_270,       # Rotated 90 CW
-                        7: Image.ROTATE_270,       # Mirrored horizontal then
-                                                   # rotated 90 CW
-                        8: Image.ROTATE_90}        # Rotated 90 CCW
-        flips = {7: Image.FLIP_LEFT_RIGHT, 5: Image.FLIP_LEFT_RIGHT}
-
-        exif = self._get_exif()
-        file_desc, thumb_fn = mkstemp(suffix=".jpg")
-        os.close(file_desc)
-
-        if exif and 'JPEGThumbnail' in exif and exif['JPEGThumbnail']:
-            LOG.debug("exif thumb for filename %s" % self.filename)
-            exif_thumbnail = exif['JPEGThumbnail']
-            thumb = open(thumb_fn, 'wb')
-            thumb.write(exif_thumbnail)
-            thumb.close()
-        else:
-            LOG.debug("no exif thumb")
-            if self.is_image_smaller():
-                shutil.copyfile(self.filename, thumb_fn)
-            else:
-                thumb = self._scale_image()
-                if thumb:
-                    thumb.save(thumb_fn, "JPEG")
-
-        if exif and 'Image Orientation' in exif:
-            orient = exif['Image Orientation'].values[0]
-            if orient > 1 and orient in orientations:
-                thumb_image = Image.open(thumb_fn)
-                tmp_thumb_img = thumb_image.transpose(orientations[orient])
-
-                if orient in flips:
-                    tmp_thumb_img = tmp_thumb_img.transpose(flips[orient])
-
-                tmp_thumb_img.save(thumb_fn, 'JPEG')
-
-        return thumb_fn
-
-    def is_image_smaller(self):
-        """Check if image is smaller than desired dimention, return boolean"""
-        image = Image.open(self.filename)
-        im_x, im_y = image.size
-        image.close()
-        return im_x <= self.thumb_x and im_y <= self.thumb_y
-
-    def _get_exif(self):
-        """
-        Get exif (if available), return as a dict
-        """
-        image_file = open(self.filename, 'rb')
-        try:
-            exif = exifread.process_file(image_file)
-        except Exception:
-            exif = {}
-            LOG.info("Exif crashed on '%s'." % self.filename)
-        finally:
-            image_file.close()
-
-        return exif
-
-    def _scale_image(self):
-        """
-        Create thumbnail. returns image object or None
-        """
-        try:
-            image_thumb = Image.open(self.filename).convert('RGB')
-        except:
-            return None
-        it_x, it_y = image_thumb.size
-        if it_x > self.thumb_x or it_y > self.thumb_y:
-            image_thumb.thumbnail((self.thumb_x, self.thumb_y),
-                                  Image.ANTIALIAS)
-        return image_thumb
@@ -1,3 +0,0 @@
-Pillow
-exifread
-sqlalchemy
@@ -1,68 +0,0 @@
-#!/usr/bin/env python
-"""
-    Project: pyGTKtalog
-    Description: Main gui file launcher
-    Type: UI
-    Author: Roman 'gryf' Dobosz, gryf73@gmail.com
-    Created: 2016-08-19
-"""
-import sys
-import tempfile
-import os
-
-from pygtktalog.dbobjects import File, Config
-from pygtktalog.dbcommon import connect, Session
-from pygtktalog.gtk2 import gui
-
-
-class App(object):
-    """Main app class"""
-
-    def __init__(self, dbname):
-        """Initialze"""
-        self._dbname = None
-        self.sess = Session()
-
-        if dbname:
-            self._dbname = dbname
-            self.engine = connect(dbname)
-        else:
-            self._create_tmp_db()
-
-        self.root = None
-        self._dbname = dbname
-
-    def _create_tmp_db(self):
-        """Create temporatry db, untill user decide to save it"""
-        fdsc, self._tmpdb = tempfile.mkstemp()
-        os.close(fdsc)
-        self.engine = connect(self._tmpdb)
-
-        self.root = File()
-        self.root.id = 1
-        self.root.filename = 'root'
-        self.root.size = 0
-        self.root.source = 0
-        self.root.type = 0
-        self.root.parent_id = 1
-
-        config = Config()
-        config.key = "image_path"
-        config.value = ":same_as_db:"
-
-        self.sess.add(self.root)
-        self.sess.add(config)
-        self.sess.commit()
-
-    def run(self):
-        """Initialize gui"""
-        gui.run()
-
-
-def main():
-    db = sys.argv if len(sys.argv) == 2 else None
-    app = App(db)
-    app.run()
-
-if __name__ == "__main__":
-    main()
setup.cfg (new file, 42 lines)
@@ -0,0 +1,42 @@
+[metadata]
+name = pycatalog
+summary = Catalog application for keeping content list of disks and discs
+description_file = README.rst
+author = Roman Dobosz
+author_email = gryf73@gmail.com
+home_page = https://github.com/gryf/pycatalog
+license = BSD
+keywords = catalog, gwhere, collection
+classifier =
+    Development Status :: 4 - Beta
+    Environment :: Console
+    Intended Audience :: End Users/Desktop
+    License :: OSI Approved :: BSD License
+    Operating System :: POSIX :: Linux
+    Programming Language :: Python
+    Programming Language :: Python :: 3
+    Programming Language :: Python :: 3.8
+    Programming Language :: Python :: 3.9
+    Programming Language :: Python :: 3.10
+    Topic :: Database
+    Topic :: Desktop Environment
+
+[install]
+record = install.log
+
+[options.entry_points]
+console_scripts =
+    pycatalog = pycatalog:main
+
+[files]
+packages =
+    pycatalog
+
+[options]
+install_requires =
+    exifread
+    sqlalchemy
+    mutagen
+
+[bdist_wheel]
+universal = 1
setup.py (31 lines changed)
@@ -1,30 +1,5 @@
-#!/usr/bin/env python2
-"""
-Setup for the pyGTKtalog project
-"""
-from distutils.core import setup
-
-
-setup(name='pygtktalog',
-      packages=['pygtktalog'],
-      version='2.0',
-      description='Catalog application with GTK interface',
-      author='Roman Dobosz',
-      author_email='gryf73@gmail.com',
-      url='https://github.com/gryf/pygtktalog',
-      download_url='https://github.com/gryf/pygtktalog.git',
-      keywords=['catalog', 'gwhere', 'where is it', 'collection', 'GTK'],
-      requires=['Pillow', 'sqlalchemy'],
-      scripts=['scripts/cmdcatalog.py'],
-      classifiers=['Programming Language :: Python :: 2',
-                   'Programming Language :: Python :: 2.7',
-                   'Programming Language :: Python :: 2 :: Only',
-                   'Development Status :: 4 - Beta',
-                   'Environment :: Console',
-                   'Intended Audience :: End Users/Desktop',
-                   'License :: OSI Approved :: BSD License',
-                   'Operating System :: OS Independent',
-                   'Topic :: Multimedia :: Graphics'],
-      long_description=open('README.rst').read(),
-      options={'test': {'verbose': False,
-                        'coverage': False}})
+#!/usr/bin/env python
+import setuptools
+
+setuptools.setup(setup_requires=['pbr>=2.0.0'], pbr=True)
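The new `setup.py` is a stub: with pbr, everything that used to be passed to `setup()` as keyword arguments now lives in `setup.cfg`. A quick `configparser` check (a hypothetical snippet, not part of the repository) shows those fields are plain INI data:

```python
import configparser

# Parse a fragment shaped like the new setup.cfg [metadata] section.
cfg = configparser.ConfigParser()
cfg.read_string("""
[metadata]
name = pycatalog
author = Roman Dobosz
license = BSD
""")

print(cfg["metadata"]["name"])     # → pycatalog
print(cfg["metadata"]["license"])  # → BSD
```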
@@ -1,4 +1,4 @@
 pytest
 pytest-cov
-pytest-pep8
+flake8
 coverage
@@ -8,7 +8,7 @@
 import unittest
 import os
 
-from pygtktalog.dbcommon import connect, Meta, Session, Base
+from pycatalog.dbcommon import connect
 
 
 class TestDataBase(unittest.TestCase):
@@ -8,7 +8,7 @@
 import unittest
 import os
 
-import pygtktalog.misc as pgtkmisc
+import pycatalog.misc as pgtkmisc
 
 
 class TestMiscModule(unittest.TestCase):
@@ -10,9 +10,9 @@ import shutil
 import tempfile
 import unittest
 
-from pygtktalog import scan
-from pygtktalog.dbobjects import File, Config, Image
-from pygtktalog.dbcommon import connect, Session
+from pycatalog import scan
+from pycatalog.dbobjects import File, Config, Image
+from pycatalog.dbcommon import connect, Session
 
 
 def populate_with_mock_files(dir_):
@@ -23,7 +23,7 @@ def populate_with_mock_files(dir_):
     files_no = 0
     for file_ in files1:
         with open(os.path.join(dir_, file_), "wb") as fobj:
-            fobj.write("\xde\xad\xbe\xef" * len(file_))
+            fobj.write(b"\xde\xad\xbe\xef" * len(file_))
             files_no += 1
 
     os.symlink(os.path.join(dir_, files1[-1]), os.path.join(dir_, 'link.jpg'))
@@ -32,7 +32,7 @@ def populate_with_mock_files(dir_):
     os.mkdir(os.path.join(dir_, 'directory'))
     for file_ in files2:
         with open(os.path.join(dir_, 'directory', file_), "wb") as fobj:
-            fobj.write("\xfe\xad\xfa\xce" * len(file_))
+            fobj.write(b"\xfe\xad\xfa\xce" * len(file_))
             files_no += 1
 
     return files_no
@@ -178,8 +178,8 @@ class TestScan(unittest.TestCase):
         self.assertTrue(file_ob is not file2_ob)
 
         # While Image objects points to the same file
-        self.assertTrue(file_ob.images[0].filename == \
-                        file2_ob.images[0].filename)
+        self.assertTrue(file_ob.images[0].filename ==
+                        file2_ob.images[0].filename)
 
         # they are different objects
         self.assertTrue(file_ob.images[0] is not file2_ob.images[0])
@@ -7,10 +7,12 @@
 """
 import os
 import unittest
+from unittest import mock
+import io
 
 import PIL
 
-from pygtktalog.video import Video
+from pycatalog.video import Video
 
 
 DATA = {"m1.avi": """ID_VIDEO_ID=0
@@ -130,7 +132,7 @@ ID_AUDIO_RATE=22050
 ID_AUDIO_NCH=1
 ID_AUDIO_CODEC=ffac3
 ID_EXIT=EOF""",
-        "m.wmv":"""ID_AUDIO_ID=1
+        "m.wmv": """ID_AUDIO_ID=1
 ID_VIDEO_ID=2
 ID_FILENAME=m.wmv
 ID_DEMUXER=asf
@@ -198,6 +200,7 @@ class Readlines(object):
     def readlines(self):
         return self.data.split('\n')
 
 
 def mock_popen(command):
     key = None
     if 'midentify' in command:
@@ -205,22 +208,25 @@ def mock_popen(command):
     elif 'jpeg:outdir' in command:
         # simulate capture for mplayer
         img_dir = command.split('"')[-2]
-        img = PIL.Image.new('RGBA', (320, 200))
+        img = PIL.Image.new('RGB', (320, 200))
         with open(os.path.join(img_dir, "00000001.jpg"), "wb") as fobj:
             img.save(fobj)
 
     return Readlines(key)
 
 
-os.popen = mock_popen
+# os.popen = mock_popen
 
 
 class TestVideo(unittest.TestCase):
     """test class for retrive midentify script output"""
 
-    def test_avi(self):
+    @mock.patch('os.popen')
+    def test_avi(self, popen):
         """test mock avi file, should return dict with expected values"""
-        avi = Video("m.avi")
+        fname = "m.avi"
+        popen.return_value = io.StringIO(DATA[fname])
+        avi = Video(fname)
         self.assertTrue(len(avi.tags) != 0, "result should have lenght > 0")
         self.assertEqual(avi.tags['audio_format'], '85')
         self.assertEqual(avi.tags['width'], 128)
@@ -233,10 +239,13 @@ class TestVideo(unittest.TestCase):
         self.assertEqual(avi.tags['duration'], '00:00:04')
         self.assertEqual(avi.tags['container'], 'avi')
 
-    def test_avi2(self):
+    @mock.patch('os.popen')
+    def test_avi2(self, popen):
         """test another mock avi file, should return dict with expected
         values"""
-        avi = Video("m1.avi")
+        fname = "m1.avi"
+        popen.return_value = io.StringIO(DATA[fname])
+        avi = Video(fname)
         self.assertTrue(len(avi.tags) != 0, "result should have lenght > 0")
         self.assertEqual(avi.tags['audio_format'], '85')
         self.assertEqual(avi.tags['width'], 128)
@@ -249,9 +258,12 @@ class TestVideo(unittest.TestCase):
         self.assertEqual(avi.tags['duration'], '00:00:04')
         self.assertEqual(avi.tags['container'], 'avi')
 
-    def test_mkv(self):
+    @mock.patch('os.popen')
+    def test_mkv(self, popen):
         """test mock mkv file, should return dict with expected values"""
-        mkv = Video("m.mkv")
+        fname = "m.mkv"
+        popen.return_value = io.StringIO(DATA[fname])
+        mkv = Video(fname)
         self.assertTrue(len(mkv.tags) != 0, "result should have lenght > 0")
         self.assertEqual(mkv.tags['audio_format'], '8192')
         self.assertEqual(mkv.tags['width'], 128)
@@ -264,24 +276,30 @@ class TestVideo(unittest.TestCase):
         self.assertEqual(mkv.tags['duration'], '00:00:04')
         self.assertTrue(mkv.tags['container'] in ('mkv', 'lavfpref'))
 
-    def test_mpg(self):
+    @mock.patch('os.popen')
+    def test_mpg(self, popen):
         """test mock mpg file, should return dict with expected values"""
-        mpg = Video("m.mpg")
+        fname = "m.mpg"
+        popen.return_value = io.StringIO(DATA[fname])
+        mpg = Video(fname)
         self.assertTrue(len(mpg.tags) != 0, "result should have lenght > 0")
-        self.assertFalse(mpg.tags.has_key('audio_format'))
+        self.assertFalse('audio_format' in mpg.tags)
         self.assertEqual(mpg.tags['width'], 128)
-        self.assertFalse(mpg.tags.has_key('audio_no_channels'))
+        self.assertFalse('audio_no_channels' in mpg.tags)
         self.assertEqual(mpg.tags['height'], 96)
         self.assertEqual(mpg.tags['video_format'], '0x10000001')
-        self.assertFalse(mpg.tags.has_key('lenght'))
+        self.assertFalse('lenght' in mpg.tags)
-        self.assertFalse(mpg.tags.has_key('audio_codec'))
+        self.assertFalse('audio_codec' in mpg.tags)
         self.assertEqual(mpg.tags['video_codec'], 'ffmpeg1')
-        self.assertFalse(mpg.tags.has_key('duration'))
+        self.assertFalse('duration' in mpg.tags)
         self.assertEqual(mpg.tags['container'], 'mpeges')
 
-    def test_ogm(self):
+    @mock.patch('os.popen')
+    def test_ogm(self, popen):
         """test mock ogm file, should return dict with expected values"""
-        ogm = Video("m.ogm")
+        fname = "m.ogm"
+        popen.return_value = io.StringIO(DATA[fname])
+        ogm = Video(fname)
         self.assertTrue(len(ogm.tags) != 0, "result should have lenght > 0")
         self.assertEqual(ogm.tags['audio_format'], '8192')
         self.assertEqual(ogm.tags['width'], 160)
@@ -294,9 +312,12 @@ class TestVideo(unittest.TestCase):
         self.assertEqual(ogm.tags['duration'], '00:00:04')
         self.assertTrue(ogm.tags['container'] in ('ogg', 'lavfpref'))
 
-    def test_wmv(self):
+    @mock.patch('os.popen')
+    def test_wmv(self, popen):
         """test mock wmv file, should return dict with expected values"""
-        wmv = Video("m.wmv")
+        fname = "m.wmv"
+        popen.return_value = io.StringIO(DATA[fname])
+        wmv = Video(fname)
         self.assertTrue(len(wmv.tags) != 0, "result should have lenght > 0")
         self.assertEqual(wmv.tags['audio_format'], '353')
         self.assertEqual(wmv.tags['width'], 852)
@@ -309,9 +330,12 @@ class TestVideo(unittest.TestCase):
         self.assertEqual(wmv.tags['duration'], '01:17:32')
         self.assertEqual(wmv.tags['container'], 'asf')
 
-    def test_mp4(self):
+    @mock.patch('os.popen')
+    def test_mp4(self, popen):
         """test mock mp4 file, should return dict with expected values"""
-        mp4 = Video("m.mp4")
+        fname = "m.mp4"
+        popen.return_value = io.StringIO(DATA[fname])
+        mp4 = Video(fname)
         self.assertTrue(len(mp4.tags) != 0, "result should have lenght > 0")
         self.assertEqual(mp4.tags['audio_format'], 'mp4a')
         self.assertEqual(mp4.tags['width'], 720)
@@ -324,21 +348,31 @@ class TestVideo(unittest.TestCase):
         self.assertEqual(mp4.tags['duration'], '00:01:09')
         self.assertEqual(mp4.tags['container'], 'lavfpref')
 
-    def test_capture(self):
+    @mock.patch('shutil.move')
+    @mock.patch('pycatalog.video.Image')
+    @mock.patch('os.listdir')
+    @mock.patch('shutil.rmtree')
+    @mock.patch('os.close')
+    @mock.patch('tempfile.mkstemp')
+    @mock.patch('tempfile.mkdtemp')
+    @mock.patch('os.popen')
+    def test_capture(self, popen, mkdtemp, mkstemp, fclose, rmtree, listdir,
+                     img, move):
         """test capture with some small movie and play a little with tags"""
-        avi = Video("m.avi")
+        fname = 'm.avi'
+        popen.return_value = io.StringIO(DATA[fname])
+        mkdtemp.return_value = '/tmp'
+        mkstemp.return_value = (10, 'foo.jpg')
+        listdir.return_value = ['a.jpg', 'b.jpg', 'c.jpg', 'd.jpg']
+
+        avi = Video(fname)
         filename = avi.capture()
-        self.assertTrue(filename != None)
-        self.assertTrue(os.path.exists(filename))
-        file_size = os.stat(filename)[6]
-        self.assertAlmostEqual(file_size/10000.0, 0.151, 0)
-        os.unlink(filename)
+        self.assertIsNotNone(filename)
 
         for length in (480, 380, 4):
             avi.tags['length'] = length
             filename = avi.capture()
             self.assertTrue(filename is not None)
-            os.unlink(filename)
 
         avi.tags['length'] = 3
         self.assertTrue(avi.capture() is None)
@@ -351,9 +385,8 @@ class TestVideo(unittest.TestCase):
         avi.tags['width'] = 1025
         filename = avi.capture()
         self.assertTrue(filename is not None)
-        os.unlink(filename)
 
-        del(avi.tags['length'])
+        del avi.tags['length']
         self.assertTrue(avi.capture() is None)
 
         self.assertTrue(len(str(avi)) > 0)
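The rewritten tests swap the module-level `os.popen = mock_popen` monkeypatch for `unittest.mock.patch`, feeding canned midentify output through `io.StringIO`. The core pattern, sketched here with a toy key=value parser standing in for the real `Video` class:

```python
import io
import os
from unittest import mock


def read_ids(path):
    # Toy stand-in for Video's midentify parsing: one key=value per line.
    out = os.popen("midentify '%s'" % path)
    return dict(line.strip().split("=", 1) for line in out if "=" in line)


with mock.patch("os.popen") as popen:
    # The patched os.popen returns our canned output instead of running
    # anything; StringIO iterates line by line just like a pipe would.
    popen.return_value = io.StringIO("ID_VIDEO_ID=0\nID_WIDTH=128\n")
    tags = read_ids("m.avi")

print(tags["ID_WIDTH"])  # → 128
```

Patching per test (rather than rebinding `os.popen` at import time) means the real function is restored automatically when each test ends.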
tox.ini (7 lines changed)
@@ -1,18 +1,19 @@
 [tox]
-envlist = cleanup,py27,pep8
+envlist = cleanup,py3,pep8
 
 usedevelop = True
 
 [testenv]
+basepython = python3
 usedevelop=True
 setenv = COVERAGE_FILE = .coverage
-commands = py.test --cov=pygtktalog --cov-report=term-missing
+commands = py.test --cov=pycatalog --cov-report=term-missing
 deps = -r{toxinidir}/requirements.txt
        -r{toxinidir}/test-requirements.txt
 
 [testenv:pep8]
 usedevelop=True
-commands = py.test --pep8 -m pep8
+commands = flake8
 deps = -r{toxinidir}/requirements.txt
        -r{toxinidir}/test-requirements.txt