commit 4ff0e49d82
Merge branch 'dropbox'

Conflicts:
    addon.xml
    changelog.txt
    resources/lib/backup.py
    resources/settings.xml
@@ -5,7 +5,7 @@ I've had to recover my database, thumbnails, and source configuration enough tim
 
 Usage:
 
-In the addon settings you can define a remote path for the destination of your xbmc files. Each backup will create a folder named in a month, day, year format so you can create multiple backups. You can keep a set number of backups by setting the integer value of the Backups to Keep setting greater than 0.
+In the addon settings you can define a remote path for the destination of your xbmc files. Each backup will create a folder named in a YYYYMMDD format so you can create multiple backups. You can keep a set number of backups by setting the integer value of the Backups to Keep setting greater than 0.
 
 On the Backup Selection page you can select which items from your user profile folder will be sent to the backup location. By default all are turned on except the Addon Data directory.
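As a minimal sketch of how that dated backup folder name is put together (it mirrors the time.strftime("%Y%m%d") call in resources/lib/backup.py further down this commit; the remote root here is just an assumed example value):

    import time

    # Assumed example value; in the addon this comes from the remote path setting.
    remote_root = "smb://mediaserver/backups/xbmc/"

    # One folder per backup run, named YYYYMMDD, e.g. ".../20130217/"
    backup_folder = remote_root + time.strftime("%Y%m%d") + "/"
    print backup_folder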
@@ -15,6 +15,10 @@ Scheduling:
 
 You can also schedule backups to be completed on a set interval via the scheduling area. When it is time for the backup to run it will be executed in the background.
 
+Using Dropbox:
+
+Using Dropbox as a storage target adds a few steps the first time you wish to run a backup. XBMC Backup needs to have permission to access your Dropbox account. When you see the prompt regarding the Dropbox URL Authorization, DO NOT click OK. Check your XBMC log file for a line from "script.xbmcbackup" containing the authorization URL. Cut/paste this into a browser and click Allow. Once this is done you can click "OK" in XBMC and proceed as normal. XBMC Backup will cache the authorization code so you only have to do this once, unless you revoke the Dropbox permissions.
+
 What this Addon Will Not Do:
 
 This is not meant as an XBMC file sync solution. If you have multiple frontends you want to keep in sync this addon may work in a "poor man's" sort of way, but it is not intended for that.
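For anyone wondering what happens behind that authorization prompt, here is a minimal sketch of the OAuth handshake using the Dropbox SDK bundled in this commit. It follows the example in resources/lib/dropbox/client.py; the app key/secret values are placeholders, and the XBMC dialog and log wiring around it is omitted.

    from dropbox import client, session

    # Placeholder app credentials; the real addon ships its own key and secret.
    APP_KEY = 'INSERT_APP_KEY_HERE'
    APP_SECRET = 'INSERT_SECRET_HERE'
    ACCESS_TYPE = 'app_folder'  # or 'dropbox', depending on how the app was registered

    sess = session.DropboxSession(APP_KEY, APP_SECRET, ACCESS_TYPE)

    # Step 1: request a token and surface the authorization URL (the addon logs it).
    request_token = sess.obtain_request_token()
    print "Authorize here, then continue:", sess.build_authorize_url(request_token)

    # Step 2: after the user clicks Allow in a browser, trade the request token
    # for an access token; caching this token is what makes the prompt a one-time step.
    access_token = sess.obtain_access_token(request_token)

    # Step 3: API calls can now be made on the user's behalf.
    db = client.DropboxClient(sess)
    print "linked account:", db.account_info()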
addon.xml
@@ -1,6 +1,6 @@
 <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
 <addon id="script.xbmcbackup"
-       name="XBMC Backup" version="0.2.3" provider-name="robweber">
+       name="XBMC Backup" version="0.3.0" provider-name="robweber">
     <requires>
         <import addon="xbmc.python" version="2.0"/>
     </requires>
@@ -9,8 +9,10 @@
     </extension>
     <extension point="xbmc.service" library="scheduler.py" start="startup" />
     <extension point="xbmc.addon.metadata">
        <summary lang="fr">Sauvegarder et restaurer vos bases de données XBMC et vos fichiers de configuration en cas de crash ou de fichiers corrompus.</summary>
        <summary lang="de">Die XBMC Datenbank sichern und bei Dateiverlust oder Beschädigung wiederherstellen.</summary>
        <summary lang="en">Backup and restore your XBMC database and configuration files in the event of a crash or file corruption.</summary>
        <description lang="fr">Avez-vous déjà perdu votre configuration XBMC et espéré avoir fait une sauvegarde ? Maintenant, vous pouvez le faire en un simple click. Vous pouvez exporter vos bases de données, playlists, miniatures, addons et autres fichiers de configuration vers n'importe quel endroit accessible depuis XBMC.</description>
        <summary lang="es">Haz copia de seguridad de tu base de datos y configuración y recupera todo en caso de fallo.</summary>
        <summary lang="es_MX">Respalda y restaura tu base de datos y archivos de configuración de XBMC dado el evento de un cuelgue o corrupción de archivos.</summary>
        <summary lang="fr">Sauvegarder et restaurer vos bases de données XBMC et vos fichiers de configuration en cas de crash ou de fichiers corrompus.</summary>
@@ -21,7 +23,7 @@
        <summary lang="sk">Zálohovanie a obnova XBMC databázy a konfiguračných súborov pre prípad havárie alebo poškodenia súboru.</summary>
        <summary lang="sv">Ta backup av eller återställ din XBMC-databas och konfigurationsfiler i händelse av en krasch eller filkorruption.</summary>
        <description lang="de">Jemals deine XBMC Konfiguration zerschossen und dir dann gewünscht, dass ein Backup existiert? Jetzt kannst du eine Sicherung mit nur einem Klick erzeugen. Du kannst deine Datenbanken, Playlisten, Thumbnails, Addons und andere Details zu einem Ort deiner Wahl sichern.</description>
-       <description lang="en">Ever hosed your XBMC configuration and wished you'd had a backup? Now you can with one easy click. You can export your database, playlist, thumbnails, addons and other configuration details to any source writeable by XBMC. Backups can be run on demand or via a scheduler.</description>
+       <description lang="en">Ever hosed your XBMC configuration and wished you'd had a backup? Now you can with one easy click. You can export your database, playlist, thumbnails, addons and other configuration details to any source writeable by XBMC or directly to Dropbox cloud storage. Backups can be run on demand or via a scheduler.</description>
        <description lang="es_MX">¿Alguna vez haz echado a perder tu configuración de XBMC y haz deseado tener un respaldo? Ahora puedes tenerlo con un simple click. Puedes exportar tu base de datos, listas de reproducción, miniaturas, addons y otros detalles de configuración correspondientes a cualquier fuente que pueda escribir XBMC. Los respaldos pueden ser efectuados a pedido o mediante una programación temporal</description>
        <description lang="fr">Avez-vous déjà perdu votre configuration XBMC et espéré avoir fait une sauvegarde ? Maintenant, vous pouvez le faire en un simple click. Vous pouvez exporter vos bases de données, playlists, miniatures, addons et autres fichiers de configuration vers n'importe quel endroit accessible depuis XBMC.</description>
        <description lang="he">Ever hosed your XBMC configuration and wished you'd had a backup? Now you can with one easy click. You can export your database, playlist, thumbnails, addons and other configuration details to any source writeable by XBMC. Backups can be run on demand or via a scheduler.</description>
changelog.txt
@@ -1,3 +1,8 @@
+Version 0.3.0
+
+major vfs rewrite
+Added Dropbox as storage target
+
 Version 0.2.3
 
 first official frodo build
@@ -16,6 +16,7 @@
     <string id="30024">Type Remote Path</string>
     <string id="30025">Remote Path Type</string>
     <string id="30026">Backups to keep (0 for all)</string>
+    <string id="30027">Dropbox</string>
 
     <string id="30030">User Addons</string>
     <string id="30031">Addon Data</string>
@@ -34,6 +35,8 @@
     <string id="30052">Writing file</string>
     <string id="30053">Starting scheduled backup</string>
     <string id="30054">Removing backup</string>
+    <string id="30056">Check log for Dropbox authorize URL</string>
+    <string id="30057">Click OK when authorized</string>
 
     <string id="30060">Enable Scheduler</string>
     <string id="30061">Schedule</string>
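The two new Dropbox strings (30056 and 30057) presumably back the authorization prompt described in the README. A hedged sketch of how string ids like these are typically surfaced, following the utils.getString pattern used in backup.py below; the exact dialog wiring lives in code that is not part of this diff:

    import xbmcgui
    import resources.lib.utils as utils

    # Assumed wiring: point the user at the log for the authorize URL,
    # then wait until they confirm they have clicked Allow.
    xbmcgui.Dialog().ok(utils.getString(30010), utils.getString(30056), utils.getString(30057))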
resources/lib/backup.py
@@ -2,18 +2,19 @@ import xbmc
 import xbmcgui
 import xbmcvfs
 import utils as utils
 import os
 import os.path
 import time
+from vfs import XBMCFileSystem,DropboxFileSystem
 
 class FileManager:
-    walk_path = ''
     fileArray = None
     verbose_log = False
     not_dir = ['.zip','.xsp','.rar']
+    vfs = None
 
-    def __init__(self,path):
-        self.walk_path = path
+    def __init__(self,vfs):
+        self.vfs = vfs
 
     def createFileList(self):
        self.fileArray = []
        self.verbose_log = utils.getSetting("verbose_log") == 'true'
@@ -21,48 +22,48 @@ class FileManager:
        #figure out which syncing options to run
        if(utils.getSetting('backup_addons') == 'true'):
            self.addFile("-addons")
-           self.walkTree(self.walk_path + "addons/")
+           self.walkTree(self.vfs.root_path + "addons/")
 
        self.addFile("-userdata")
 
        if(utils.getSetting('backup_addon_data') == 'true'):
            self.addFile("-userdata/addon_data")
-           self.walkTree(self.walk_path + "userdata/addon_data/")
+           self.walkTree(self.vfs.root_path + "userdata/addon_data/")
 
        if(utils.getSetting('backup_database') == 'true'):
            self.addFile("-userdata/Database")
-           self.walkTree(self.walk_path + "userdata/Database")
+           self.walkTree(self.vfs.root_path + "userdata/Database")
 
        if(utils.getSetting("backup_playlists") == 'true'):
            self.addFile("-userdata/playlists")
-           self.walkTree(self.walk_path + "userdata/playlists")
+           self.walkTree(self.vfs.root_path + "userdata/playlists")
 
        if(utils.getSetting("backup_thumbnails") == "true"):
            self.addFile("-userdata/Thumbnails")
-           self.walkTree(self.walk_path + "userdata/Thumbnails")
+           self.walkTree(self.vfs.root_path + "userdata/Thumbnails")
 
        if(utils.getSetting("backup_config") == "true"):
            self.addFile("-userdata/keymaps")
-           self.walkTree(self.walk_path + "userdata/keymaps")
+           self.walkTree(self.vfs.root_path + "userdata/keymaps")
 
            self.addFile("-userdata/peripheral_data")
-           self.walkTree(self.walk_path + "userdata/peripheral_data")
+           self.walkTree(self.vfs.root_path + "userdata/peripheral_data")
 
            #this part is an oddity
-           dirs,configFiles = xbmcvfs.listdir(self.walk_path + "userdata/")
+           dirs,configFiles = self.vfs.listdir(self.vfs.root_path + "userdata/")
            for aFile in configFiles:
                if(aFile.endswith(".xml")):
                    self.addFile("userdata/" + aFile)
 
    def walkTree(self,directory):
-       dirs,files = xbmcvfs.listdir(directory)
+       dirs,files = self.vfs.listdir(directory)
 
        #create all the subdirs first
        for aDir in dirs:
            dirPath = xbmc.translatePath(directory + "/" + aDir)
            file_ext = aDir.split('.')[-1]
 
-           self.addFile("-" + dirPath[len(self.walk_path):].decode("UTF-8"))
+           self.addFile("-" + dirPath[len(self.vfs.root_path):].decode("UTF-8"))
            #catch for "non directory" type files
            if (not any(file_ext in s for s in self.not_dir)):
                self.walkTree(dirPath)
@@ -70,7 +71,7 @@ class FileManager:
        #copy all the files
        for aFile in files:
            filePath = xbmc.translatePath(directory + "/" + aFile)
-           self.addFile(filePath[len(self.walk_path):].decode("UTF-8"))
+           self.addFile(filePath[len(self.vfs.root_path):].decode("UTF-8"))
 
    def addFile(self,filename):
        #write the full remote path name of this file
@@ -84,10 +85,12 @@ class XbmcBackup:
    #constants for initiating a back or restore
    Backup = 0
    Restore = 1
 
+   #remote file system
+   vfs = None
+
-   local_path = ''
-   remote_root = ''
-   remote_path = ''
+   local_vfs = None
+   remote_vfs = None
    restoreFile = None
 
    #for the progress bar
@@ -98,51 +101,55 @@ class XbmcBackup:
    fileManager = None
 
    def __init__(self):
-       self.local_path = xbmc.makeLegalFilename(xbmc.translatePath("special://home"),False);
-
-       if(utils.getSetting('remote_selection') == '1'):
-           self.remote_root = utils.getSetting('remote_path_2')
-           utils.setSetting("remote_path","")
-       elif(utils.getSetting('remote_selection') == '0'):
-           self.remote_root = utils.getSetting("remote_path")
+       self.local_vfs = XBMCFileSystem()
+       self.local_vfs.set_root(xbmc.translatePath("special://home"))
 
-       #fix slashes
-       self.remote_root = self.remote_root.replace("\\","/")
-
-       #check if trailing slash is included
-       if(self.remote_root[-1:] != "/"):
-           self.remote_root = self.remote_root + "/"
+       self.configureVFS()
 
        utils.log(utils.getString(30046))
 
+   def configureVFS(self):
+       if(utils.getSetting('remote_selection') == '1'):
+           self.remote_vfs = XBMCFileSystem()
+           self.remote_vfs.set_root(utils.getSetting('remote_path_2'))
+           utils.setSetting("remote_path","")
+       elif(utils.getSetting('remote_selection') == '0'):
+           self.remote_vfs = XBMCFileSystem()
+           self.remote_vfs.set_root(utils.getSetting("remote_path"))
+       elif(utils.getSetting('remote_selection') == '2'):
+           self.remote_vfs = DropboxFileSystem()
+           self.remote_vfs.set_root('/')
+
    def run(self,mode=-1,runSilent=False):
 
-       #append backup folder name
-       progressBarTitle = utils.getString(30010) + " - "
-       if(mode == self.Backup and self.remote_root != ''):
-           self.remote_path = self.remote_root + time.strftime("%Y%m%d") + "/"
-           progressBarTitle = progressBarTitle + utils.getString(30016)
-       elif(mode == self.Restore and utils.getSetting("backup_name") != '' and self.remote_root != ''):
-           self.remote_path = self.remote_root + utils.getSetting("backup_name") + "/"
-           progressBarTitle = progressBarTitle + utils.getString(30017)
-       else:
-           self.remote_path = ""
-
        #check if we should use the progress bar
        if(utils.getSetting('run_silent') == 'false' and not runSilent):
            self.progressBar = xbmcgui.DialogProgress()
-           self.progressBar.create(progressBarTitle,utils.getString(30049) + "......")
+           self.progressBar.create(utils.getString(30010),utils.getString(30049) + "......")
 
-       utils.log(utils.getString(30047) + ": " + self.local_path)
-       utils.log(utils.getString(30048) + ": " + self.remote_path)
+       #determine backup mode
+       if(mode == -1):
+           mode = int(utils.getSetting('addon_mode'))
+
+       #append backup folder name
+       remote_base_path = ""
+       if(mode == self.Backup and self.remote_vfs.root_path != ''):
+           #capture base path for backup rotation
+           remote_base_path = self.remote_vfs.set_root(self.remote_vfs.root_path + time.strftime("%Y%m%d") + "/")
+       elif(mode == self.Restore and utils.getSetting("backup_name") != '' and self.remote_vfs.root_path != ''):
+           self.remote_vfs.set_root(self.remote_vfs.root_path + utils.getSetting("backup_name") + "/")
+       else:
+           self.remote_vfs = None
+
+       utils.log(utils.getString(30047) + ": " + self.local_vfs.root_path)
+       utils.log(utils.getString(30048) + ": " + self.remote_vfs.root_path)
 
        #run the correct mode
        if(mode == self.Backup):
            utils.log(utils.getString(30023) + " - " + utils.getString(30016))
-           self.fileManager = FileManager(self.local_path)
+           self.fileManager = FileManager(self.local_vfs)
 
            #for backups check if remote path exists
-           if(xbmcvfs.exists(self.remote_path)):
+           if(self.remote_vfs.exists(self.remote_vfs.root_path)):
                #this will fail - need a disclaimer here
                utils.log(utils.getString(30050))
 
@@ -152,7 +159,7 @@ class XbmcBackup:
            total_backups = int(utils.getSetting('backup_rotation'))
            if(total_backups > 0):
 
-               dirs,files = xbmcvfs.listdir(self.remote_root)
+               dirs,files = self.remote_vfs.listdir(remote_base_path)
                if(len(dirs) > total_backups):
                    #remove backups to equal total wanted
                    dirs.sort()
@@ -163,19 +170,19 @@ class XbmcBackup:
                    while(remove_num >= 0 and not self.checkCancel()):
                        self.updateProgress(utils.getString(30054) + " " + dirs[remove_num])
                        utils.log("Removing backup " + dirs[remove_num])
-                       xbmcvfs.rmdir(self.remote_root + dirs[remove_num] + "/",True)
+                       self.remote_vfs.rmdir(remote_base_path + dirs[remove_num] + "/")
                        remove_num = remove_num - 1
 
 
        else:
            utils.log(utils.getString(30023) + " - " + utils.getString(30017))
-           self.fileManager = FileManager(self.remote_path)
+           self.fileManager = FileManager(self.remote_vfs)
 
            #for restores remote path must exist
-           if(xbmcvfs.exists(self.remote_path)):
+           if(self.remote_vfs.exists(self.remote_vfs.root_path)):
                self.restoreFiles()
            else:
-               xbmcgui.Dialog().ok(utils.getString(30010),utils.getString(30045),self.remote_path)
+               xbmcgui.Dialog().ok(utils.getString(30010),utils.getString(30045),self.remote_vfs.root_path)
 
        if(utils.getSetting('run_silent') == 'false' and not runSilent):
            self.progressBar.close()
@@ -183,7 +190,7 @@ class XbmcBackup:
    def syncFiles(self):
 
        #make the remote directory
-       xbmcvfs.mkdir(self.remote_path)
+       self.remote_vfs.mkdir(self.remote_vfs.root_path)
 
        utils.log(utils.getString(30051))
        self.fileManager.createFileList()
@@ -191,7 +198,7 @@ class XbmcBackup:
        allFiles = self.fileManager.getFileList()
 
        #write list from local to remote
-       self.writeFiles(allFiles,self.local_path,self.remote_path)
+       self.writeFiles(allFiles,self.local_vfs,self.remote_vfs)
 
    def restoreFiles(self):
        self.fileManager.createFileList()
@@ -200,25 +207,30 @@ class XbmcBackup:
        allFiles = self.fileManager.getFileList()
 
        #write list from remote to local
-       self.writeFiles(allFiles,self.remote_path,self.local_path)
+       self.writeFiles(allFiles,self.remote_vfs,self.local_vfs)
 
        #call update addons to refresh everything
        xbmc.executebuiltin('UpdateLocalAddons')
 
    def writeFiles(self,fileList,source,dest):
-       utils.log("Writing files to: " + dest)
+       utils.log("Writing files to: " + dest.root_path)
        self.filesTotal = len(fileList)
        self.filesLeft = self.filesTotal
 
        #write each file from source to destination
        for aFile in fileList:
            if(not self.checkCancel()):
-               utils.log('Writing file: ' + source + aFile,xbmc.LOGDEBUG)
+               utils.log('Writing file: ' + source.root_path + aFile,xbmc.LOGDEBUG)
                self.updateProgress(aFile)
                if (aFile.startswith("-")):
-                   xbmcvfs.mkdir(xbmc.makeLegalFilename(dest + aFile[1:],False))
+                   dest.mkdir(dest.root_path + aFile[1:])
                else:
-                   xbmcvfs.copy(xbmc.makeLegalFilename(source + aFile),xbmc.makeLegalFilename(dest + aFile,False))
+                   if(isinstance(source,DropboxFileSystem)):
+                       #if copying from dropbox we need the file handle, use get_file
+                       source.get_file(source.root_path + aFile,dest.root_path + aFile)
+                   else:
+                       #copy using normal method
+                       dest.put(source.root_path + aFile,dest.root_path + aFile)
 
    def updateProgress(self,message=''):
        self.filesLeft = self.filesLeft - 1
@@ -236,4 +248,4 @@ class XbmcBackup:
        return result
 
    def isReady(self):
-       return True if self.remote_root != '' else False
+       return True if self.remote_vfs != None else False
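The backup.py changes above route every filesystem operation through a small abstraction imported from vfs (XBMCFileSystem and DropboxFileSystem). vfs.py itself is not included in the hunks shown here, so the following is only a hedged sketch of the interface those call sites imply; the base class name and signatures are assumptions, while the method names come straight from the calls above.

    class Vfs(object):
        """Assumed shape of the interface backup.py relies on."""

        root_path = ''

        def set_root(self, path):
            # run() uses the return value as the rotation base path, which suggests
            # set_root hands back the previous root before switching to the new one.
            old_root = self.root_path
            self.root_path = path
            return old_root

        def listdir(self, directory):
            # Expected to return (dirs, files), mirroring xbmcvfs.listdir().
            raise NotImplementedError

        def mkdir(self, directory):
            raise NotImplementedError

        def exists(self, path):
            raise NotImplementedError

        def rmdir(self, directory):
            raise NotImplementedError

        def put(self, source, dest):
            # Copy a single file from source to dest within/into this backend.
            raise NotImplementedError

    # DropboxFileSystem additionally exposes get_file(source, dest), which is why
    # writeFiles() above special-cases restores that read from Dropbox.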
resources/lib/dropbox/.gitignore (new file, vendored, 2 lines)
@@ -0,0 +1,2 @@
+
+*.pyc
resources/lib/dropbox/__init__.py (new file, 3 lines)
@@ -0,0 +1,3 @@
+from __future__ import absolute_import
+
+from . import client, rest, session
resources/lib/dropbox/client.py (new file, 965 lines)
@@ -0,0 +1,965 @@
"""
|
||||
The main client API you'll be working with most often. You'll need to
|
||||
configure a dropbox.session.DropboxSession for this to work, but otherwise
|
||||
it's fairly self-explanatory.
|
||||
|
||||
Before you can begin making requests to the dropbox API, you have to
|
||||
authenticate your application with Dropbox and get the user to
|
||||
authorize your application to use dropbox on his behalf. A typical
|
||||
program, from the initial imports to making a simple request (``account_info``),
|
||||
looks like this:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
# Include the Dropbox SDK libraries
|
||||
from dropbox import client, rest, session
|
||||
|
||||
# Get your app key and secret from the Dropbox developer website
|
||||
APP_KEY = 'INSERT_APP_KEY_HERE'
|
||||
APP_SECRET = 'INSERT_SECRET_HERE'
|
||||
|
||||
# ACCESS_TYPE should be 'dropbox' or 'app_folder' as configured for your app
|
||||
ACCESS_TYPE = 'INSERT_ACCESS_TYPE_HERE'
|
||||
|
||||
sess = session.DropboxSession(APP_KEY, APP_SECRET, ACCESS_TYPE)
|
||||
|
||||
request_token = sess.obtain_request_token()
|
||||
|
||||
url = sess.build_authorize_url(request_token)
|
||||
|
||||
# Make the user sign in and authorize this token
|
||||
print "url:", url
|
||||
print "Please visit this website and press the 'Allow' button, then hit 'Enter' here."
|
||||
raw_input()
|
||||
|
||||
# This will fail if the user didn't visit the above URL and hit 'Allow'
|
||||
access_token = sess.obtain_access_token(request_token)
|
||||
|
||||
client = client.DropboxClient(sess)
|
||||
print "linked account:", client.account_info()
|
||||
|
||||
"""
|
||||
from __future__ import absolute_import
|
||||
|
||||
import re
|
||||
import os
|
||||
from StringIO import StringIO
|
||||
try:
|
||||
import json
|
||||
except ImportError:
|
||||
import simplejson as json
|
||||
|
||||
from .rest import ErrorResponse, RESTClient
|
||||
|
||||
def format_path(path):
|
||||
"""Normalize path for use with the Dropbox API.
|
||||
|
||||
This function turns multiple adjacent slashes into single
|
||||
slashes, then ensures that there's a leading slash but
|
||||
not a trailing slash.
|
||||
"""
|
||||
if not path:
|
||||
return path
|
||||
|
||||
path = re.sub(r'/+', '/', path)
|
||||
|
||||
if path == '/':
|
||||
return (u"" if isinstance(path, unicode) else "")
|
||||
else:
|
||||
return '/' + path.strip('/')
|
||||
|
||||
class DropboxClient(object):
|
||||
"""
|
||||
The main access point of doing REST calls on Dropbox. You should
|
||||
first create and configure a dropbox.session.DropboxSession object,
|
||||
and then pass it into DropboxClient's constructor. DropboxClient
|
||||
then does all the work of properly calling each API method
|
||||
with the correct OAuth authentication.
|
||||
|
||||
You should be aware that any of these methods can raise a
|
||||
rest.ErrorResponse exception if the server returns a non-200
|
||||
or invalid HTTP response. Note that a 401 return status at any
|
||||
point indicates that the user needs to be reauthenticated.
|
||||
"""
|
||||
|
||||
def __init__(self, session, rest_client=RESTClient):
|
||||
"""Initialize the DropboxClient object.
|
||||
|
||||
Args:
|
||||
``session``: A dropbox.session.DropboxSession object to use for making requests.
|
||||
``rest_client``: A dropbox.rest.RESTClient-like object to use for making requests. [optional]
|
||||
"""
|
||||
self.session = session
|
||||
self.rest_client = rest_client
|
||||
|
||||
def request(self, target, params=None, method='POST', content_server=False):
|
||||
"""Make an HTTP request to a target API method.
|
||||
|
||||
This is an internal method used to properly craft the url, headers, and
|
||||
params for a Dropbox API request. It is exposed for you in case you
|
||||
need craft other API calls not in this library or if you want to debug it.
|
||||
|
||||
Args:
|
||||
- ``target``: The target URL with leading slash (e.g. '/files')
|
||||
- ``params``: A dictionary of parameters to add to the request
|
||||
- ``method``: An HTTP method (e.g. 'GET' or 'POST')
|
||||
- ``content_server``: A boolean indicating whether the request is to the
|
||||
API content server, for example to fetch the contents of a file
|
||||
rather than its metadata.
|
||||
|
||||
Returns:
|
||||
- A tuple of (url, params, headers) that should be used to make the request.
|
||||
OAuth authentication information will be added as needed within these fields.
|
||||
"""
|
||||
assert method in ['GET','POST', 'PUT'], "Only 'GET', 'POST', and 'PUT' are allowed."
|
||||
if params is None:
|
||||
params = {}
|
||||
|
||||
host = self.session.API_CONTENT_HOST if content_server else self.session.API_HOST
|
||||
base = self.session.build_url(host, target)
|
||||
headers, params = self.session.build_access_headers(method, base, params)
|
||||
|
||||
if method in ('GET', 'PUT'):
|
||||
url = self.session.build_url(host, target, params)
|
||||
else:
|
||||
url = self.session.build_url(host, target)
|
||||
|
||||
return url, params, headers
|
||||
|
||||
|
||||
def account_info(self):
|
||||
"""Retrieve information about the user's account.
|
||||
|
||||
Returns:
|
||||
- A dictionary containing account information.
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#account-info
|
||||
"""
|
||||
url, params, headers = self.request("/account/info", method='GET')
|
||||
|
||||
return self.rest_client.GET(url, headers)
|
||||
|
||||
def get_chunked_uploader(self, file_obj, length):
|
||||
"""Creates a ChunkedUploader to upload the given file-like object.
|
||||
|
||||
Args:
|
||||
- ``file_obj``: The file-like object which is the source of the data
|
||||
being uploaded.
|
||||
- ``length``: The number of bytes to upload.
|
||||
|
||||
The expected use of this function is as follows:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
bigFile = open("data.txt", 'rb')
|
||||
|
||||
uploader = myclient.get_chunked_uploader(bigFile, size)
|
||||
print "uploading: ", size
|
||||
while uploader.offset < size:
|
||||
try:
|
||||
upload = uploader.upload_chunked()
|
||||
except rest.ErrorResponse, e:
|
||||
# perform error handling and retry logic
|
||||
|
||||
The SDK leaves the error handling and retry logic to the developer
|
||||
to implement, as the exact requirements will depend on the application
|
||||
involved.
|
||||
"""
|
||||
return DropboxClient.ChunkedUploader(self, file_obj, length)
|
||||
|
||||
|
||||
|
||||
|
||||
class ChunkedUploader(object):
|
||||
"""Contains the logic around a chunked upload, which uploads a
|
||||
large file to Dropbox via the /chunked_upload endpoint
|
||||
"""
|
||||
def __init__(self, client, file_obj, length):
|
||||
self.client = client
|
||||
self.offset = 0
|
||||
self.upload_id = None
|
||||
|
||||
self.last_block = None
|
||||
self.file_obj = file_obj
|
||||
self.target_length = length
|
||||
|
||||
|
||||
def upload_chunked(self, chunk_size = 4 * 1024 * 1024):
|
||||
"""Uploads data from this ChunkedUploader's file_obj in chunks, until
|
||||
an error occurs. Throws an exception when an error occurs, and can
|
||||
be called again to resume the upload.
|
||||
|
||||
Args:
|
||||
- ``chunk_size``: The number of bytes to put in each chunk. [default 4 MB]
|
||||
"""
|
||||
|
||||
while self.offset < self.target_length:
|
||||
next_chunk_size = min(chunk_size, self.target_length - self.offset)
|
||||
if self.last_block == None:
|
||||
self.last_block = self.file_obj.read(next_chunk_size)
|
||||
|
||||
try:
|
||||
(self.offset, self.upload_id) = self.client.upload_chunk(StringIO(self.last_block), next_chunk_size, self.offset, self.upload_id)
|
||||
self.last_block = None
|
||||
except ErrorResponse, e:
|
||||
reply = e.body
|
||||
if "offset" in reply and reply['offset'] != 0:
|
||||
if reply['offset'] > self.offset:
|
||||
self.last_block = None
|
||||
self.offset = reply['offset']
|
||||
|
||||
def finish(self, path, overwrite=False, parent_rev=None):
|
||||
"""Commits the bytes uploaded by this ChunkedUploader to a file
|
||||
in the users dropbox.
|
||||
|
||||
Args:
|
||||
- ``path``: The full path of the file in the Dropbox.
|
||||
- ``overwrite``: Whether to overwrite an existing file at the given path. [default False]
|
||||
If overwrite is False and a file already exists there, Dropbox
|
||||
will rename the upload to make sure it doesn't overwrite anything.
|
||||
You need to check the metadata returned for the new name.
|
||||
This field should only be True if your intent is to potentially
|
||||
clobber changes to a file that you don't know about.
|
||||
- ``parent_rev``: The rev field from the 'parent' of this upload. [optional]
|
||||
If your intent is to update the file at the given path, you should
|
||||
pass the parent_rev parameter set to the rev value from the most recent
|
||||
metadata you have of the existing file at that path. If the server
|
||||
has a more recent version of the file at the specified path, it will
|
||||
automatically rename your uploaded file, spinning off a conflict.
|
||||
Using this parameter effectively causes the overwrite parameter to be ignored.
|
||||
The file will always be overwritten if you send the most-recent parent_rev,
|
||||
and it will never be overwritten if you send a less-recent one.
|
||||
"""
|
||||
|
||||
path = "/commit_chunked_upload/%s%s" % (self.client.session.root, format_path(path))
|
||||
|
||||
params = dict(
|
||||
overwrite = bool(overwrite),
|
||||
upload_id = self.upload_id
|
||||
)
|
||||
|
||||
if parent_rev is not None:
|
||||
params['parent_rev'] = parent_rev
|
||||
|
||||
url, params, headers = self.client.request(path, params, content_server=True)
|
||||
|
||||
return self.client.rest_client.POST(url, params, headers)
|
||||
|
||||
def upload_chunk(self, file_obj, length, offset=0, upload_id=None):
|
||||
"""Uploads a single chunk of data from the given file like object. The majority of users
|
||||
should use the ChunkedUploader object, which provides a simpler interface to the
|
||||
chunked_upload API endpoint.
|
||||
|
||||
Args:
|
||||
- ``file_obj``: The source of the data to upload
|
||||
- ``length``: The number of bytes to upload in one chunk.
|
||||
|
||||
Returns:
|
||||
- The reply from the server, as a dictionary
|
||||
"""
|
||||
|
||||
params = dict()
|
||||
|
||||
if upload_id:
|
||||
params['upload_id'] = upload_id
|
||||
params['offset'] = offset
|
||||
|
||||
url, ignored_params, headers = self.request("/chunked_upload", params, method='PUT', content_server=True)
|
||||
|
||||
try:
|
||||
reply = self.rest_client.PUT(url, file_obj, headers)
|
||||
return reply['offset'], reply['upload_id']
|
||||
except ErrorResponse, e:
|
||||
raise e
|
||||
|
||||
|
||||
def put_file(self, full_path, file_obj, overwrite=False, parent_rev=None):
|
||||
"""Upload a file.
|
||||
|
||||
A typical use case would be as follows:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
f = open('working-draft.txt')
|
||||
response = client.put_file('/magnum-opus.txt', f)
|
||||
print "uploaded:", response
|
||||
|
||||
which would return the metadata of the uploaded file, similar to:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
{
|
||||
'bytes': 77,
|
||||
'icon': 'page_white_text',
|
||||
'is_dir': False,
|
||||
'mime_type': 'text/plain',
|
||||
'modified': 'Wed, 20 Jul 2011 22:04:50 +0000',
|
||||
'path': '/magnum-opus.txt',
|
||||
'rev': '362e2029684fe',
|
||||
'revision': 221922,
|
||||
'root': 'dropbox',
|
||||
'size': '77 bytes',
|
||||
'thumb_exists': False
|
||||
}
|
||||
|
||||
Args:
|
||||
- ``full_path``: The full path to upload the file to, *including the file name*.
|
||||
If the destination directory does not yet exist, it will be created.
|
||||
- ``file_obj``: A file-like object to upload. If you would like, you can pass a string as file_obj.
|
||||
- ``overwrite``: Whether to overwrite an existing file at the given path. [default False]
|
||||
If overwrite is False and a file already exists there, Dropbox
|
||||
will rename the upload to make sure it doesn't overwrite anything.
|
||||
You need to check the metadata returned for the new name.
|
||||
This field should only be True if your intent is to potentially
|
||||
clobber changes to a file that you don't know about.
|
||||
- ``parent_rev``: The rev field from the 'parent' of this upload. [optional]
|
||||
If your intent is to update the file at the given path, you should
|
||||
pass the parent_rev parameter set to the rev value from the most recent
|
||||
metadata you have of the existing file at that path. If the server
|
||||
has a more recent version of the file at the specified path, it will
|
||||
automatically rename your uploaded file, spinning off a conflict.
|
||||
Using this parameter effectively causes the overwrite parameter to be ignored.
|
||||
The file will always be overwritten if you send the most-recent parent_rev,
|
||||
and it will never be overwritten if you send a less-recent one.
|
||||
|
||||
Returns:
|
||||
- A dictionary containing the metadata of the newly uploaded file.
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#files-put
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 503: User over quota
|
||||
|
||||
Note: In Python versions below version 2.6, httplib doesn't handle file-like objects.
|
||||
In that case, this code will read the entire file into memory (!).
|
||||
"""
|
||||
path = "/files_put/%s%s" % (self.session.root, format_path(full_path))
|
||||
|
||||
params = {
|
||||
'overwrite': bool(overwrite),
|
||||
}
|
||||
|
||||
if parent_rev is not None:
|
||||
params['parent_rev'] = parent_rev
|
||||
|
||||
|
||||
url, params, headers = self.request(path, params, method='PUT', content_server=True)
|
||||
|
||||
return self.rest_client.PUT(url, file_obj, headers)
|
||||
|
||||
def get_file(self, from_path, rev=None):
|
||||
"""Download a file.
|
||||
|
||||
Unlike most other calls, get_file returns a raw HTTPResponse with the connection open.
|
||||
You should call .read() and perform any processing you need, then close the HTTPResponse.
|
||||
|
||||
A typical usage looks like this:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
out = open('magnum-opus.txt', 'w')
f, metadata = client.get_file_and_metadata('/magnum-opus.txt')
out.write(f.read())
|
||||
|
||||
which would download the file ``magnum-opus.txt`` and write the contents into
|
||||
the file ``magnum-opus.txt`` on the local filesystem.
|
||||
|
||||
Args:
|
||||
- ``from_path``: The path to the file to be downloaded.
|
||||
- ``rev``: A previous rev value of the file to be downloaded. [optional]
|
||||
|
||||
Returns:
|
||||
- An httplib.HTTPResponse that is the result of the request.
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: No file was found at the given path, or the file that was there was deleted.
|
||||
- 200: Request was okay but response was malformed in some way.
|
||||
"""
|
||||
path = "/files/%s%s" % (self.session.root, format_path(from_path))
|
||||
|
||||
params = {}
|
||||
if rev is not None:
|
||||
params['rev'] = rev
|
||||
|
||||
url, params, headers = self.request(path, params, method='GET', content_server=True)
|
||||
return self.rest_client.request("GET", url, headers=headers, raw_response=True)
|
||||
|
||||
def get_file_and_metadata(self, from_path, rev=None):
|
||||
"""Download a file alongwith its metadata.
|
||||
|
||||
Acts as a thin wrapper around get_file() (see get_file() comments for
|
||||
more details)
|
||||
|
||||
Args:
|
||||
- ``from_path``: The path to the file to be downloaded.
|
||||
- ``rev``: A previous rev value of the file to be downloaded. [optional]
|
||||
|
||||
Returns:
|
||||
- An httplib.HTTPResponse that is the result of the request.
|
||||
- A dictionary containing the metadata of the file (see
|
||||
https://www.dropbox.com/developers/reference/api#metadata for details).
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: No file was found at the given path, or the file that was there was deleted.
|
||||
- 200: Request was okay but response was malformed in some way.
|
||||
"""
|
||||
file_res = self.get_file(from_path, rev)
|
||||
metadata = DropboxClient.__parse_metadata_as_dict(file_res)
|
||||
|
||||
return file_res, metadata
|
||||
|
||||
@staticmethod
|
||||
def __parse_metadata_as_dict(dropbox_raw_response):
|
||||
"""Parses file metadata from a raw dropbox HTTP response, raising a
|
||||
dropbox.rest.ErrorResponse if parsing fails.
|
||||
"""
|
||||
metadata = None
|
||||
for header, header_val in dropbox_raw_response.getheaders():
|
||||
if header.lower() == 'x-dropbox-metadata':
|
||||
try:
|
||||
metadata = json.loads(header_val)
|
||||
except ValueError:
|
||||
raise ErrorResponse(dropbox_raw_response)
|
||||
if not metadata: raise ErrorResponse(dropbox_raw_response)
|
||||
return metadata
|
||||
|
||||
def delta(self, cursor=None):
|
||||
"""A way of letting you keep up with changes to files and folders in a
|
||||
user's Dropbox. You can periodically call delta() to get a list of "delta
|
||||
entries", which are instructions on how to update your local state to
|
||||
match the server's state.
|
||||
|
||||
Arguments:
|
||||
- ``cursor``: On the first call, omit this argument (or pass in ``None``). On
|
||||
subsequent calls, pass in the ``cursor`` string returned by the previous
|
||||
call.
|
||||
|
||||
Returns: A dict with three fields.
|
||||
- ``entries``: A list of "delta entries" (described below)
|
||||
- ``reset``: If ``True``, you should reset your local state to be an empty folder
before processing the list of delta entries. This is only ``True``
|
||||
in rare situations.
|
||||
- ``cursor``: A string that is used to keep track of your current state.
|
||||
On the next call to delta(), pass in this value to return entries
|
||||
that were recorded since the cursor was returned.
|
||||
- ``has_more``: If ``True``, then there are more entries available; you can
|
||||
call delta() again immediately to retrieve those entries. If ``False``,
|
||||
then wait at least 5 minutes (preferably longer) before checking again.
|
||||
|
||||
Delta Entries: Each entry is a 2-item list of one of following forms:
|
||||
- [*path*, *metadata*]: Indicates that there is a file/folder at the given
|
||||
path. You should add the entry to your local path. (The *metadata*
|
||||
value is the same as what would be returned by the ``metadata()`` call.)
|
||||
|
||||
- If the new entry includes parent folders that don't yet exist in your
|
||||
local state, create those parent folders in your local state. You
|
||||
will eventually get entries for those parent folders.
|
||||
- If the new entry is a file, replace whatever your local state has at
|
||||
*path* with the new entry.
|
||||
- If the new entry is a folder, check what your local state has at
|
||||
*path*. If it's a file, replace it with the new entry. If it's a
|
||||
folder, apply the new *metadata* to the folder, but do not modify
|
||||
the folder's children.
|
||||
- [*path*, ``nil``]: Indicates that there is no file/folder at the *path* on
|
||||
Dropbox. To update your local state to match, delete whatever is at *path*,
|
||||
including any children (you will sometimes also get "delete" delta entries
|
||||
for the children, but this is not guaranteed). If your local state doesn't
|
||||
have anything at *path*, ignore this entry.
|
||||
|
||||
Remember: Dropbox treats file names in a case-insensitive but case-preserving
|
||||
way. To facilitate this, the *path* strings above are lower-cased versions of
|
||||
the actual path. The *metadata* dicts have the original, case-preserved path.
|
||||
"""
|
||||
path = "/delta"
|
||||
|
||||
params = {}
|
||||
if cursor is not None:
|
||||
params['cursor'] = cursor
|
||||
|
||||
url, params, headers = self.request(path, params)
|
||||
|
||||
return self.rest_client.POST(url, params, headers)
|
||||
|
||||
|
||||
def create_copy_ref(self, from_path):
|
||||
"""Creates and returns a copy ref for a specific file. The copy ref can be
|
||||
used to instantly copy that file to the Dropbox of another account.
|
||||
|
||||
Args:
|
||||
- ``path``: The path to the file for a copy ref to be created on.
|
||||
|
||||
Returns:
|
||||
- A dictionary that looks like the following example:
|
||||
|
||||
``{"expires":"Fri, 31 Jan 2042 21:01:05 +0000", "copy_ref":"z1X6ATl6aWtzOGq0c3g5Ng"}``
|
||||
|
||||
"""
|
||||
path = "/copy_ref/%s%s" % (self.session.root, format_path(from_path))
|
||||
|
||||
url, params, headers = self.request(path, {}, method='GET')
|
||||
|
||||
return self.rest_client.GET(url, headers)
|
||||
|
||||
def add_copy_ref(self, copy_ref, to_path):
|
||||
"""Adds the file referenced by the copy ref to the specified path
|
||||
|
||||
Args:
|
||||
- ``copy_ref``: A copy ref string that was returned from a create_copy_ref call.
|
||||
The copy_ref can be created from any other Dropbox account, or from the same account.
|
||||
- ``path``: The path to where the file will be created.
|
||||
|
||||
Returns:
|
||||
- A dictionary containing the metadata of the new copy of the file.
|
||||
"""
|
||||
path = "/fileops/copy"
|
||||
|
||||
params = {'from_copy_ref': copy_ref,
|
||||
'to_path': format_path(to_path),
|
||||
'root': self.session.root}
|
||||
|
||||
url, params, headers = self.request(path, params)
|
||||
|
||||
return self.rest_client.POST(url, params, headers)
|
||||
|
||||
def file_copy(self, from_path, to_path):
|
||||
"""Copy a file or folder to a new location.
|
||||
|
||||
Args:
|
||||
- ``from_path``: The path to the file or folder to be copied.
|
||||
- ``to_path``: The destination path of the file or folder to be copied.
|
||||
This parameter should include the destination filename (e.g.
|
||||
from_path: '/test.txt', to_path: '/dir/test.txt'). If there's
|
||||
already a file at the to_path, this copy will be renamed to
|
||||
be unique.
|
||||
|
||||
Returns:
|
||||
- A dictionary containing the metadata of the new copy of the file or folder.
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#fileops-copy
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of:
|
||||
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: No file was found at given from_path.
|
||||
- 503: User over storage quota.
|
||||
"""
|
||||
params = {'root': self.session.root,
|
||||
'from_path': format_path(from_path),
|
||||
'to_path': format_path(to_path),
|
||||
}
|
||||
|
||||
url, params, headers = self.request("/fileops/copy", params)
|
||||
|
||||
return self.rest_client.POST(url, params, headers)
|
||||
|
||||
|
||||
def file_create_folder(self, path):
|
||||
"""Create a folder.
|
||||
|
||||
Args:
|
||||
- ``path``: The path of the new folder.
|
||||
|
||||
Returns:
|
||||
- A dictionary containing the metadata of the newly created folder.
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#fileops-create-folder
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 403: A folder at that path already exists.
|
||||
"""
|
||||
params = {'root': self.session.root, 'path': format_path(path)}
|
||||
|
||||
url, params, headers = self.request("/fileops/create_folder", params)
|
||||
|
||||
return self.rest_client.POST(url, params, headers)
|
||||
|
||||
|
||||
def file_delete(self, path):
|
||||
"""Delete a file or folder.
|
||||
|
||||
Args:
|
||||
- ``path``: The path of the file or folder.
|
||||
|
||||
Returns:
|
||||
- A dictionary containing the metadata of the just deleted file.
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#fileops-delete
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: No file was found at the given path.
|
||||
"""
|
||||
params = {'root': self.session.root, 'path': format_path(path)}
|
||||
|
||||
url, params, headers = self.request("/fileops/delete", params)
|
||||
|
||||
return self.rest_client.POST(url, params, headers)
|
||||
|
||||
|
||||
def file_move(self, from_path, to_path):
|
||||
"""Move a file or folder to a new location.
|
||||
|
||||
Args:
|
||||
- ``from_path``: The path to the file or folder to be moved.
|
||||
- ``to_path``: The destination path of the file or folder to be moved.
|
||||
This parameter should include the destination filename (e.g.
|
||||
from_path: '/test.txt', to_path: '/dir/test.txt'). If there's
|
||||
already a file at the to_path, this file or folder will be renamed to
|
||||
be unique.
|
||||
|
||||
Returns:
|
||||
- A dictionary containing the metadata of the new copy of the file or folder.
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#fileops-move
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: No file was found at given from_path.
|
||||
- 503: User over storage quota.
|
||||
"""
|
||||
params = {'root': self.session.root, 'from_path': format_path(from_path), 'to_path': format_path(to_path)}
|
||||
|
||||
url, params, headers = self.request("/fileops/move", params)
|
||||
|
||||
return self.rest_client.POST(url, params, headers)
|
||||
|
||||
|
||||
def metadata(self, path, list=True, file_limit=25000, hash=None, rev=None, include_deleted=False):
|
||||
"""Retrieve metadata for a file or folder.
|
||||
|
||||
A typical use would be:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
folder_metadata = client.metadata('/')
|
||||
print "metadata:", folder_metadata
|
||||
|
||||
which would return the metadata of the root directory. This
|
||||
will look something like:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
{
|
||||
'bytes': 0,
|
||||
'contents': [
|
||||
{
|
||||
'bytes': 0,
|
||||
'icon': 'folder',
|
||||
'is_dir': True,
|
||||
'modified': 'Thu, 25 Aug 2011 00:03:15 +0000',
|
||||
'path': '/Sample Folder',
|
||||
'rev': '803beb471',
|
||||
'revision': 8,
|
||||
'root': 'dropbox',
|
||||
'size': '0 bytes',
|
||||
'thumb_exists': False
|
||||
},
|
||||
{
|
||||
'bytes': 77,
|
||||
'icon': 'page_white_text',
|
||||
'is_dir': False,
|
||||
'mime_type': 'text/plain',
|
||||
'modified': 'Wed, 20 Jul 2011 22:04:50 +0000',
|
||||
'path': '/magnum-opus.txt',
|
||||
'rev': '362e2029684fe',
|
||||
'revision': 221922,
|
||||
'root': 'dropbox',
|
||||
'size': '77 bytes',
|
||||
'thumb_exists': False
|
||||
}
|
||||
],
|
||||
'hash': 'efdac89c4da886a9cece1927e6c22977',
|
||||
'icon': 'folder',
|
||||
'is_dir': True,
|
||||
'path': '/',
|
||||
'root': 'app_folder',
|
||||
'size': '0 bytes',
|
||||
'thumb_exists': False
|
||||
}
|
||||
|
||||
In this example, the root directory contains two things: ``Sample Folder``,
|
||||
which is a folder, and ``/magnum-opus.txt``, which is a text file 77 bytes long
|
||||
|
||||
Args:
|
||||
- ``path``: The path to the file or folder.
|
||||
- ``list``: Whether to list all contained files (only applies when
|
||||
path refers to a folder).
|
||||
- ``file_limit``: The maximum number of file entries to return within
|
||||
a folder. If the number of files in the directory exceeds this
|
||||
limit, an exception is raised. The server will return at max
|
||||
25,000 files within a folder.
|
||||
- ``hash``: Every directory listing has a hash parameter attached that
|
||||
can then be passed back into this function later to save on
bandwidth. Rather than returning an unchanged folder's contents,
the server will instead return a 304.
|
||||
- ``rev``: The revision of the file to retrieve the metadata for. [optional]
|
||||
This parameter only applies for files. If omitted, you'll receive
|
||||
the most recent revision metadata.
|
||||
|
||||
Returns:
|
||||
- A dictionary containing the metadata of the file or folder
|
||||
(and contained files if appropriate).
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#metadata
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
|
||||
- 304: Current directory hash matches hash parameters, so contents are unchanged.
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: No file was found at given path.
|
||||
- 406: Too many file entries to return.
|
||||
"""
|
||||
path = "/metadata/%s%s" % (self.session.root, format_path(path))
|
||||
|
||||
params = {'file_limit': file_limit,
|
||||
'list': 'true',
|
||||
'include_deleted': include_deleted,
|
||||
}
|
||||
|
||||
if not list:
|
||||
params['list'] = 'false'
|
||||
if hash is not None:
|
||||
params['hash'] = hash
|
||||
if rev:
|
||||
params['rev'] = rev
|
||||
|
||||
url, params, headers = self.request(path, params, method='GET')
|
||||
|
||||
return self.rest_client.GET(url, headers)
|
||||
|
||||
def thumbnail(self, from_path, size='large', format='JPEG'):
|
||||
"""Download a thumbnail for an image.
|
||||
|
||||
Unlike most other calls, thumbnail returns a raw HTTPResponse with the connection open.
|
||||
You should call .read() and perform any processing you need, then close the HTTPResponse.
|
||||
|
||||
Args:
|
||||
- ``from_path``: The path to the file to be thumbnailed.
|
||||
- ``size``: A string describing the desired thumbnail size.
|
||||
At this time, 'small', 'medium', and 'large' are
|
||||
officially supported sizes (32x32, 64x64, and 128x128
|
||||
respectively), though others may be available. Check
|
||||
https://www.dropbox.com/developers/reference/api#thumbnails for
|
||||
more details.
|
||||
|
||||
Returns:
|
||||
- An httplib.HTTPResponse that is the result of the request.
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: No file was found at the given from_path, or files of that type cannot be thumbnailed.
|
||||
- 415: Image is invalid and cannot be thumbnailed.
|
||||
"""
|
||||
assert format in ['JPEG', 'PNG'], "expected a thumbnail format of 'JPEG' or 'PNG', got %s" % format
|
||||
|
||||
path = "/thumbnails/%s%s" % (self.session.root, format_path(from_path))
|
||||
|
||||
url, params, headers = self.request(path, {'size': size, 'format': format}, method='GET', content_server=True)
|
||||
return self.rest_client.request("GET", url, headers=headers, raw_response=True)
|
||||
|
||||
def thumbnail_and_metadata(self, from_path, size='large', format='JPEG'):
|
||||
"""Download a thumbnail for an image alongwith its metadata.
|
||||
|
||||
Acts as a thin wrapper around thumbnail() (see thumbnail() comments for
|
||||
more details)
|
||||
|
||||
Args:
|
||||
- ``from_path``: The path to the file to be thumbnailed.
|
||||
- ``size``: A string describing the desired thumbnail size. See thumbnail()
|
||||
for details.
|
||||
|
||||
Returns:
|
||||
- An httplib.HTTPResponse that is the result of the request.
|
||||
- A dictionary containing the metadata of the file whose thumbnail
|
||||
was downloaded (see https://www.dropbox.com/developers/reference/api#metadata
|
||||
for details).
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: No file was found at the given from_path, or files of that type cannot be thumbnailed.
|
||||
- 415: Image is invalid and cannot be thumbnailed.
|
||||
- 200: Request was okay but response was malformed in some way.
|
||||
"""
|
||||
thumbnail_res = self.thumbnail(from_path, size, format)
|
||||
metadata = DropboxClient.__parse_metadata_as_dict(thumbnail_res)
|
||||
|
||||
return thumbnail_res, metadata
|
||||
|
||||
def search(self, path, query, file_limit=1000, include_deleted=False):
|
||||
"""Search directory for filenames matching query.
|
||||
|
||||
Args:
|
||||
- ``path``: The directory to search within.
|
||||
- ``query``: The query to search on (minimum 3 characters).
|
||||
- ``file_limit``: The maximum number of file entries to return within a folder.
|
||||
The server will return at max 1,000 files.
|
||||
- ``include_deleted``: Whether to include deleted files in search results.
|
||||
|
||||
Returns:
|
||||
- A list of the metadata of all matching files (up to
|
||||
file_limit entries). For a detailed description of what
|
||||
this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#search
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
"""
|
||||
path = "/search/%s%s" % (self.session.root, format_path(path))
|
||||
|
||||
params = {
|
||||
'query': query,
|
||||
'file_limit': file_limit,
|
||||
'include_deleted': include_deleted,
|
||||
}
|
||||
|
||||
url, params, headers = self.request(path, params)
|
||||
|
||||
return self.rest_client.POST(url, params, headers)
|
||||
|
||||
def revisions(self, path, rev_limit=1000):
|
||||
"""Retrieve revisions of a file.
|
||||
|
||||
Args:
|
||||
- ``path``: The file to fetch revisions for. Note that revisions
|
||||
are not available for folders.
|
||||
- ``rev_limit``: The maximum number of file entries to return within
|
||||
a folder. The server will return at max 1,000 revisions.
|
||||
|
||||
Returns:
|
||||
- A list of the metadata of all matching files (up to rev_limit entries).
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#revisions
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: No revisions were found at the given path.
|
||||
"""
|
||||
path = "/revisions/%s%s" % (self.session.root, format_path(path))
|
||||
|
||||
params = {
|
||||
'rev_limit': rev_limit,
|
||||
}
|
||||
|
||||
url, params, headers = self.request(path, params, method='GET')
|
||||
|
||||
return self.rest_client.GET(url, headers)
|
||||
|
||||
def restore(self, path, rev):
|
||||
"""Restore a file to a previous revision.
|
||||
|
||||
Args:
|
||||
- ``path``: The file to restore. Note that folders can't be restored.
|
||||
- ``rev``: A previous rev value of the file to be restored to.
|
||||
|
||||
Returns:
|
||||
- A dictionary containing the metadata of the newly restored file.
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#restore
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: Unable to find the file at the given revision.
|
||||
"""
|
||||
path = "/restore/%s%s" % (self.session.root, format_path(path))
|
||||
|
||||
params = {
|
||||
'rev': rev,
|
||||
}
|
||||
|
||||
url, params, headers = self.request(path, params)
|
||||
|
||||
return self.rest_client.POST(url, params, headers)
|
||||
|
||||
def media(self, path):
|
||||
"""Get a temporary unauthenticated URL for a media file.
|
||||
|
||||
All of Dropbox's API methods require OAuth, which may cause problems in
|
||||
situations where an application expects to be able to hit a URL multiple times
|
||||
(for example, a media player seeking around a video file). This method
|
||||
creates a time-limited URL that can be accessed without any authentication,
|
||||
and returns that to you, along with an expiration time.
|
||||
|
||||
Args:
|
||||
- ``path``: The file to return a URL for. Folders are not supported.
|
||||
|
||||
Returns:
|
||||
- A dictionary that looks like the following example:
|
||||
|
||||
``{'url': 'https://dl.dropbox.com/0/view/wvxv1fw6on24qw7/file.mov', 'expires': 'Thu, 16 Sep 2011 01:01:25 +0000'}``
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#media
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: Unable to find the file at the given path.
|
||||
"""
|
||||
path = "/media/%s%s" % (self.session.root, format_path(path))
|
||||
|
||||
url, params, headers = self.request(path, method='GET')
|
||||
|
||||
return self.rest_client.GET(url, headers)
|
||||
|
||||
def share(self, path):
|
||||
"""Create a shareable link to a file or folder.
|
||||
|
||||
Shareable links created on Dropbox are time-limited, but don't require any
|
||||
authentication, so they can be given out freely. The time limit should allow
|
||||
at least a day of shareability, though users have the ability to disable
|
||||
a link from their account if they like.
|
||||
|
||||
Args:
|
||||
- ``path``: The file or folder to share.
|
||||
|
||||
Returns:
|
||||
- A dictionary that looks like the following example:
|
||||
|
||||
``{'url': 'http://www.dropbox.com/s/m/a2mbDa2', 'expires': 'Thu, 16 Sep 2011 01:01:25 +0000'}``
|
||||
|
||||
For a detailed description of what this call returns, visit:
|
||||
https://www.dropbox.com/developers/reference/api#shares
|
||||
|
||||
Raises:
|
||||
- A dropbox.rest.ErrorResponse with an HTTP status of
|
||||
|
||||
- 400: Bad request (may be due to many things; check e.error for details)
|
||||
- 404: Unable to find the file at the given path.
|
||||
"""
|
||||
path = "/shares/%s%s" % (self.session.root, format_path(path))
|
||||
|
||||
url, params, headers = self.request(path, method='GET')
|
||||
|
||||
return self.rest_client.GET(url, headers)
|
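The revisions() and restore() calls above are designed to be used together: fetch the revision history of a file, pick an earlier 'rev' value from the returned metadata, and roll the file back to it. A minimal sketch, assuming an already-linked DropboxClient named client and an illustrative file path:

# Fetch up to ten revisions of a backed-up file.
revs = client.revisions('/xbmcbackup/guisettings.xml', rev_limit=10)

# Each entry is a metadata dict with a 'rev' field; restore() rolls the
# file back to that revision and returns the new metadata.
if len(revs) > 1:
    restored = client.restore('/xbmcbackup/guisettings.xml', revs[1]['rev'])
    print restored['path'], restored['rev']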
4
resources/lib/dropbox/pkg_resources.py
Normal file
@ -0,0 +1,4 @@
|
||||
import resources.lib.utils as utils
|
||||
|
||||
def resource_filename(*args):
|
||||
return utils.addon_dir() + "/resources/lib/dropbox/trusted-certs.crt"
|
317
resources/lib/dropbox/rest.py
Normal file
@ -0,0 +1,317 @@
|
||||
"""
|
||||
A simple JSON REST request abstraction layer that is used by the
|
||||
dropbox.client and dropbox.session modules. You shouldn't need to use this.
|
||||
"""
|
||||
|
||||
import httplib
|
||||
import os
|
||||
import pkg_resources
|
||||
import re
|
||||
import socket
|
||||
import ssl
|
||||
import sys
|
||||
import urllib
|
||||
import urlparse
|
||||
from . import util
|
||||
|
||||
try:
|
||||
import json
|
||||
except ImportError:
|
||||
import simplejson as json
|
||||
|
||||
SDK_VERSION = "1.5.1"
|
||||
|
||||
TRUSTED_CERT_FILE = pkg_resources.resource_filename(__name__, 'trusted-certs.crt')
|
||||
|
||||
class ProperHTTPSConnection(httplib.HTTPConnection):
|
||||
"""
|
||||
httplib.HTTPSConnection is broken because it doesn't do server certificate
|
||||
validation. This class does certificate validation by ensuring:
|
||||
1. The certificate sent down by the server has a signature chain to one of
|
||||
the certs in our 'trusted-certs.crt' (this is mostly handled by the 'ssl'
|
||||
module).
|
||||
2. The hostname in the certificate matches the hostname we're connecting to.
|
||||
"""
|
||||
|
||||
def __init__(self, host, port, trusted_cert_file=TRUSTED_CERT_FILE):
|
||||
httplib.HTTPConnection.__init__(self, host, port)
|
||||
self.ca_certs = trusted_cert_file
|
||||
self.cert_reqs = ssl.CERT_REQUIRED
|
||||
|
||||
def connect(self):
|
||||
sock = create_connection((self.host, self.port))
|
||||
self.sock = ssl.wrap_socket(sock, cert_reqs=self.cert_reqs, ca_certs=self.ca_certs)
|
||||
cert = self.sock.getpeercert()
|
||||
hostname = self.host.split(':', 1)[0]  # strip any port from the host before matching
|
||||
match_hostname(cert, hostname)
|
||||
|
||||
class CertificateError(ValueError):
|
||||
pass
|
||||
|
||||
def _dnsname_to_pat(dn):
|
||||
pats = []
|
||||
for frag in dn.split(r'.'):
|
||||
if frag == '*':
|
||||
# When '*' is a fragment by itself, it matches a non-empty dotless
|
||||
# fragment.
|
||||
pats.append('[^.]+')
|
||||
else:
|
||||
# Otherwise, '*' matches any dotless fragment.
|
||||
frag = re.escape(frag)
|
||||
pats.append(frag.replace(r'\*', '[^.]*'))
|
||||
return re.compile(r'\A' + r'\.'.join(pats) + r'\Z', re.IGNORECASE)
|
||||
|
||||
# This was ripped from Python 3.2 so it's not tested
|
||||
def match_hostname(cert, hostname):
|
||||
"""Verify that *cert* (in decoded format as returned by
|
||||
SSLSocket.getpeercert()) matches the *hostname*. RFC 2818 rules
|
||||
are mostly followed, but IP addresses are not accepted for *hostname*.
|
||||
|
||||
CertificateError is raised on failure. On success, the function
|
||||
returns nothing.
|
||||
"""
|
||||
if not cert:
|
||||
raise ValueError("empty or no certificate")
|
||||
dnsnames = []
|
||||
san = cert.get('subjectAltName', ())
|
||||
for key, value in san:
|
||||
if key == 'DNS':
|
||||
if _dnsname_to_pat(value).match(hostname):
|
||||
return
|
||||
dnsnames.append(value)
|
||||
if not san:
|
||||
# The subject is only checked when subjectAltName is empty
|
||||
for sub in cert.get('subject', ()):
|
||||
for key, value in sub:
|
||||
# XXX according to RFC 2818, the most specific Common Name
|
||||
# must be used.
|
||||
if key == 'commonName':
|
||||
if _dnsname_to_pat(value).match(hostname):
|
||||
return
|
||||
dnsnames.append(value)
|
||||
if len(dnsnames) > 1:
|
||||
raise CertificateError("hostname %r doesn't match either of %s" % (hostname, ', '.join(map(repr, dnsnames))))
|
||||
elif len(dnsnames) == 1:
|
||||
raise CertificateError("hostname %r doesn't match %r" % (hostname, dnsnames[0]))
|
||||
else:
|
||||
raise CertificateError("no appropriate commonName or subjectAltName fields were found")
|
||||
|
||||
def create_connection(address):
|
||||
host, port = address
|
||||
err = None
|
||||
for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
|
||||
af, socktype, proto, canonname, sa = res
|
||||
sock = None
|
||||
try:
|
||||
sock = socket.socket(af, socktype, proto)
|
||||
sock.connect(sa)
|
||||
return sock
|
||||
|
||||
except socket.error, _:
|
||||
err = _
|
||||
if sock is not None:
|
||||
sock.close()
|
||||
|
||||
if err is not None:
|
||||
raise err
|
||||
else:
|
||||
raise socket.error("getaddrinfo returns an empty list")
|
||||
|
||||
def json_loadb(data):
|
||||
if sys.version_info >= (3,):
|
||||
data = data.decode('utf8')
|
||||
return json.loads(data)
|
||||
|
||||
class RESTClientObject(object):
|
||||
def __init__(self, http_connect=None):
|
||||
self.http_connect = http_connect
|
||||
|
||||
def request(self, method, url, post_params=None, body=None, headers=None, raw_response=False):
|
||||
post_params = post_params or {}
|
||||
headers = headers or {}
|
||||
headers['User-Agent'] = 'OfficialDropboxPythonSDK/' + SDK_VERSION
|
||||
|
||||
if post_params:
|
||||
if body:
|
||||
raise ValueError("body parameter cannot be used with post_params parameter")
|
||||
body = urllib.urlencode(post_params)
|
||||
headers["Content-type"] = "application/x-www-form-urlencoded"
|
||||
|
||||
# maintain dynamic lookup of ProperHTTPSConnection
|
||||
http_connect = self.http_connect
|
||||
if http_connect is None:
|
||||
http_connect = ProperHTTPSConnection
|
||||
|
||||
host = urlparse.urlparse(url).hostname
|
||||
conn = http_connect(host, 443)
|
||||
|
||||
try:
|
||||
# This code is here because httplib in pre-2.6 Pythons
|
||||
# doesn't handle file-like objects as HTTP bodies and
|
||||
# thus requires manual buffering
|
||||
if not hasattr(body, 'read'):
|
||||
conn.request(method, url, body, headers)
|
||||
else:
|
||||
# Content-Length should be set to prevent upload truncation errors.
|
||||
clen, raw_data = util.analyze_file_obj(body)
|
||||
headers["Content-Length"] = str(clen)
|
||||
conn.request(method, url, "", headers)
|
||||
if raw_data is not None:
|
||||
conn.send(raw_data)
|
||||
else:
|
||||
BLOCKSIZE = 4 * 1024 * 1024 # 4MB buffering just because
|
||||
bytes_read = 0
|
||||
while True:
|
||||
data = body.read(BLOCKSIZE)
|
||||
if not data:
|
||||
break
|
||||
# Catch Content-Length overflow before the HTTP server does
|
||||
bytes_read += len(data)
|
||||
if bytes_read > clen:
|
||||
raise util.AnalyzeFileObjBug(clen, bytes_read)
|
||||
conn.send(data)
|
||||
if bytes_read != clen:
|
||||
raise util.AnalyzeFileObjBug(clen, bytes_read)
|
||||
|
||||
except socket.error, e:
|
||||
raise RESTSocketError(host, e)
|
||||
except CertificateError, e:
|
||||
raise RESTSocketError(host, "SSL certificate error: " + str(e))
|
||||
|
||||
r = conn.getresponse()
|
||||
if r.status != 200:
|
||||
raise ErrorResponse(r)
|
||||
|
||||
if raw_response:
|
||||
return r
|
||||
else:
|
||||
try:
|
||||
resp = json_loadb(r.read())
|
||||
except ValueError:
|
||||
raise ErrorResponse(r)
|
||||
finally:
|
||||
conn.close()
|
||||
|
||||
return resp
|
||||
|
||||
def GET(self, url, headers=None, raw_response=False):
|
||||
assert type(raw_response) == bool
|
||||
return self.request("GET", url, headers=headers, raw_response=raw_response)
|
||||
|
||||
def POST(self, url, params=None, headers=None, raw_response=False):
|
||||
assert type(raw_response) == bool
|
||||
if params is None:
|
||||
params = {}
|
||||
|
||||
return self.request("POST", url,
|
||||
post_params=params, headers=headers, raw_response=raw_response)
|
||||
|
||||
def PUT(self, url, body, headers=None, raw_response=False):
|
||||
assert type(raw_response) == bool
|
||||
return self.request("PUT", url, body=body, headers=headers, raw_response=raw_response)
|
||||
|
||||
class RESTClient(object):
|
||||
IMPL = RESTClientObject()
|
||||
|
||||
"""
|
||||
A class with all static methods to perform JSON REST requests that is used internally
|
||||
by the Dropbox Client API. It provides just enough gear to make requests
|
||||
and get responses as JSON data (when applicable). All requests happen over SSL.
|
||||
"""
|
||||
|
||||
@classmethod
|
||||
def request(cls, *n, **kw):
|
||||
"""Perform a REST request and parse the response.
|
||||
|
||||
Args:
|
||||
- ``method``: An HTTP method (e.g. 'GET' or 'POST').
|
||||
- ``url``: The URL to make a request to.
|
||||
- ``post_params``: A dictionary of parameters to put in the body of the request.
|
||||
This option may not be used if the body parameter is given.
|
||||
- ``body``: The body of the request. Typically, this value will be a string.
|
||||
It may also be a file-like object in Python 2.6 and above. The body
|
||||
parameter may not be used with the post_params parameter.
|
||||
- ``headers``: A dictionary of headers to send with the request.
|
||||
- ``raw_response``: Whether to return the raw httplib.HTTPResponse object. [default False]
|
||||
It's best enabled for requests that return large amounts of data that you
|
||||
would want to .read() incrementally rather than loading into memory. Also
|
||||
use this for calls where you need to read metadata like status or headers,
|
||||
or if the body is not JSON.
|
||||
|
||||
Returns:
|
||||
- The JSON-decoded data from the server, unless raw_response is
|
||||
specified, in which case an httplib.HTTPResponse object is returned instead.
|
||||
|
||||
Raises:
|
||||
- dropbox.rest.ErrorResponse: The returned HTTP status is not 200, or the body was
|
||||
not parsed from JSON successfully.
|
||||
- dropbox.rest.RESTSocketError: A socket.error was raised while contacting Dropbox.
|
||||
"""
|
||||
return cls.IMPL.request(*n, **kw)
|
||||
|
||||
@classmethod
|
||||
def GET(cls, *n, **kw):
|
||||
"""Perform a GET request using RESTClient.request"""
|
||||
return cls.IMPL.GET(*n, **kw)
|
||||
|
||||
@classmethod
|
||||
def POST(cls, *n, **kw):
|
||||
"""Perform a POST request using RESTClient.request"""
|
||||
return cls.IMPL.POST(*n, **kw)
|
||||
|
||||
@classmethod
|
||||
def PUT(cls, *n, **kw):
|
||||
"""Perform a PUT request using RESTClient.request"""
|
||||
return cls.IMPL.PUT(*n, **kw)
|
||||
|
||||
class RESTSocketError(socket.error):
|
||||
"""
|
||||
A light wrapper for socket.errors raised by dropbox.rest.RESTClient.request
|
||||
that adds more information to the socket.error.
|
||||
"""
|
||||
|
||||
def __init__(self, host, e):
|
||||
msg = "Error connecting to \"%s\": %s" % (host, str(e))
|
||||
socket.error.__init__(self, msg)
|
||||
|
||||
class ErrorResponse(Exception):
|
||||
"""
|
||||
Raised by dropbox.rest.RESTClient.request for requests that:
|
||||
- Return a non-200 HTTP response, or
|
||||
- Have a non-JSON response body, or
|
||||
- Have a malformed/missing header in the response.
|
||||
|
||||
Most errors that Dropbox returns will have a error field that is unpacked and
|
||||
placed on the ErrorResponse exception. In some situations, a user_error field
|
||||
will also come back. Messages under user_error are worth showing to an end-user
|
||||
of your app, while other errors are likely only useful for you as the developer.
|
||||
"""
|
||||
|
||||
def __init__(self, http_resp):
|
||||
self.status = http_resp.status
|
||||
self.reason = http_resp.reason
|
||||
self.body = http_resp.read()
|
||||
self.headers = http_resp.getheaders()
|
||||
|
||||
try:
|
||||
self.body = json_loadb(self.body)
|
||||
self.error_msg = self.body.get('error')
|
||||
self.user_error_msg = self.body.get('user_error')
|
||||
except ValueError:
|
||||
self.error_msg = None
|
||||
self.user_error_msg = None
|
||||
|
||||
def __str__(self):
|
||||
if self.user_error_msg and self.user_error_msg != self.error_msg:
|
||||
# one is translated and the other is English
|
||||
msg = "%s (%s)" % (self.user_error_msg, self.error_msg)
|
||||
elif self.error_msg:
|
||||
msg = self.error_msg
|
||||
elif not self.body:
|
||||
msg = self.reason
|
||||
else:
|
||||
msg = "Error parsing response body or headers: " +\
|
||||
"Body - %s Headers - %s" % (self.body, self.headers)
|
||||
|
||||
return "[%d] %s" % (self.status, repr(msg))
|
||||
|
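ErrorResponse and RESTSocketError are the two exceptions callers of this module are expected to handle. A small sketch of the pattern, using an illustrative unauthenticated call and importing as vfs.py does, with resources/lib on the path (real callers go through DropboxClient, which signs requests via DropboxSession):

from dropbox import rest

try:
    data = rest.RESTClient.GET("https://api.dropbox.com/1/account/info")
except rest.ErrorResponse, e:
    # Non-200 status or an unparseable body; e.status, e.reason and e.body
    # are populated, and e.user_error_msg (when present) is safe to show users.
    print "Dropbox error:", e
except rest.RESTSocketError, e:
    # Network or SSL failure while contacting Dropbox.
    print "Connection problem:", e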
286
resources/lib/dropbox/session.py
Normal file
@ -0,0 +1,286 @@
|
||||
"""
|
||||
dropbox.session.DropboxSession is responsible for holding OAuth authentication info
|
||||
(app key/secret, request key/secret, access key/secret) as well as configuration information for your app
|
||||
('app_folder' or 'dropbox' access type, optional locale preference). It knows how to
|
||||
use all of this information to craft properly constructed requests to Dropbox.
|
||||
|
||||
A DropboxSession object must be passed to a dropbox.client.DropboxClient object upon
|
||||
initialization.
|
||||
"""
|
||||
from __future__ import absolute_import
|
||||
|
||||
import random
|
||||
import sys
|
||||
import time
|
||||
import urllib
|
||||
|
||||
try:
|
||||
from urlparse import parse_qs
|
||||
except ImportError:
|
||||
# fall back for Python 2.5
|
||||
from cgi import parse_qs
|
||||
|
||||
from . import rest
|
||||
|
||||
class OAuthToken(object):
|
||||
__slots__ = ('key', 'secret')
|
||||
def __init__(self, key, secret):
|
||||
self.key = key
|
||||
self.secret = secret
|
||||
|
||||
class DropboxSession(object):
|
||||
API_VERSION = 1
|
||||
|
||||
API_HOST = "api.dropbox.com"
|
||||
WEB_HOST = "www.dropbox.com"
|
||||
API_CONTENT_HOST = "api-content.dropbox.com"
|
||||
|
||||
def __init__(self, consumer_key, consumer_secret, access_type, locale=None, rest_client=rest.RESTClient):
|
||||
"""Initialize a DropboxSession object.
|
||||
|
||||
Your consumer key and secret are available
|
||||
at https://www.dropbox.com/developers/apps
|
||||
|
||||
Args:
|
||||
- ``access_type``: Either 'dropbox' or 'app_folder'. All path-based operations
|
||||
will occur relative to either the user's Dropbox root directory
|
||||
or your application's app folder.
|
||||
- ``locale``: A locale string ('en', 'pt_PT', etc.) [optional]
|
||||
The locale setting will be used to translate any user-facing error
|
||||
messages that the server generates. At this time Dropbox supports
|
||||
'en', 'es', 'fr', 'de', and 'ja', though we will be supporting more
|
||||
languages in the future. If you send a language the server doesn't
|
||||
support, messages will remain in English. Look for these translated
|
||||
messages in rest.ErrorResponse exceptions as e.user_error_msg.
|
||||
"""
|
||||
assert access_type in ['dropbox', 'app_folder'], "expected access_type of 'dropbox' or 'app_folder'"
|
||||
self.consumer_creds = OAuthToken(consumer_key, consumer_secret)
|
||||
self.token = None
|
||||
self.request_token = None
|
||||
self.root = 'sandbox' if access_type == 'app_folder' else 'dropbox'
|
||||
self.locale = locale
|
||||
self.rest_client = rest_client
|
||||
|
||||
def is_linked(self):
|
||||
"""Return whether the DropboxSession has an access token attached."""
|
||||
return bool(self.token)
|
||||
|
||||
def unlink(self):
|
||||
"""Remove any attached access token from the DropboxSession."""
|
||||
self.token = None
|
||||
|
||||
def set_token(self, access_token, access_token_secret):
|
||||
"""Attach an access token to the DropboxSession.
|
||||
|
||||
Note that the access 'token' is made up of both a token string
|
||||
and a secret string.
|
||||
"""
|
||||
self.token = OAuthToken(access_token, access_token_secret)
|
||||
|
||||
def set_request_token(self, request_token, request_token_secret):
|
||||
"""Attach an request token to the DropboxSession.
|
||||
|
||||
Note that the request 'token' is made up of both a token string
|
||||
and a secret string.
|
||||
"""
|
||||
self.request_token = OAuthToken(request_token, request_token_secret)
|
||||
|
||||
def build_path(self, target, params=None):
|
||||
"""Build the path component for an API URL.
|
||||
|
||||
This method urlencodes the parameters, adds them
|
||||
to the end of the target url, and puts a marker for the API
|
||||
version in front.
|
||||
|
||||
Args:
|
||||
- ``target``: A target url (e.g. '/files') to build upon.
|
||||
- ``params``: A dictionary of parameters (name to value). [optional]
|
||||
|
||||
Returns:
|
||||
- The path and parameters components of an API URL.
|
||||
"""
|
||||
if sys.version_info < (3,) and type(target) == unicode:
|
||||
target = target.encode("utf8")
|
||||
|
||||
target_path = urllib.quote(target)
|
||||
|
||||
params = params or {}
|
||||
params = params.copy()
|
||||
|
||||
if self.locale:
|
||||
params['locale'] = self.locale
|
||||
|
||||
if params:
|
||||
return "/%d%s?%s" % (self.API_VERSION, target_path, urllib.urlencode(params))
|
||||
else:
|
||||
return "/%d%s" % (self.API_VERSION, target_path)
|
||||
|
||||
def build_url(self, host, target, params=None):
|
||||
"""Build an API URL.
|
||||
|
||||
This method adds scheme and hostname to the path
|
||||
returned from build_path.
|
||||
|
||||
Args:
|
||||
- ``target``: A target url (e.g. '/files') to build upon.
|
||||
- ``params``: A dictionary of parameters (name to value). [optional]
|
||||
|
||||
Returns:
|
||||
- The full API URL.
|
||||
"""
|
||||
return "https://%s%s" % (host, self.build_path(target, params))
|
||||
|
||||
def build_authorize_url(self, request_token, oauth_callback=None):
|
||||
"""Build a request token authorization URL.
|
||||
|
||||
After obtaining a request token, you'll need to send the user to
|
||||
the URL returned from this function so that they can confirm that
|
||||
they want to connect their account to your app.
|
||||
|
||||
Args:
|
||||
- ``request_token``: A request token from obtain_request_token.
|
||||
- ``oauth_callback``: A url to redirect back to with the authorized
|
||||
request token.
|
||||
|
||||
Returns:
|
||||
- An authorization URL for the given request token.
|
||||
"""
|
||||
params = {'oauth_token': request_token.key,
|
||||
}
|
||||
|
||||
if oauth_callback:
|
||||
params['oauth_callback'] = oauth_callback
|
||||
|
||||
return self.build_url(self.WEB_HOST, '/oauth/authorize', params)
|
||||
|
||||
def obtain_request_token(self):
|
||||
"""Obtain a request token from the Dropbox API.
|
||||
|
||||
This is your first step in the OAuth process. You call this to get a
|
||||
request_token from the Dropbox server that you can then use with
|
||||
DropboxSession.build_authorize_url() to get the user to authorize it.
|
||||
After it's authorized you use this token with
|
||||
DropboxSession.obtain_access_token() to get an access token.
|
||||
|
||||
NOTE: You should only need to do this once for each user, and then you
|
||||
can store the access token for that user for later operations.
|
||||
|
||||
Returns:
|
||||
- A dropbox.session.OAuthToken representing the request token Dropbox assigned
|
||||
to this app. Also attaches the request token as self.request_token.
|
||||
"""
|
||||
self.token = None # clear any token currently on the request
|
||||
url = self.build_url(self.API_HOST, '/oauth/request_token')
|
||||
headers, params = self.build_access_headers('POST', url)
|
||||
|
||||
response = self.rest_client.POST(url, headers=headers, params=params, raw_response=True)
|
||||
self.request_token = self._parse_token(response.read())
|
||||
return self.request_token
|
||||
|
||||
def obtain_access_token(self, request_token=None):
|
||||
"""Obtain an access token for a user.
|
||||
|
||||
After you get a request token, and then send the user to the authorize
|
||||
URL, you can use the authorized request token with this method to get the
|
||||
access token to use for future operations. The access token is stored on
|
||||
the session object.
|
||||
|
||||
Args:
|
||||
- ``request_token``: A request token from obtain_request_token. [optional]
|
||||
The request_token should have been authorized via the
|
||||
authorization url from build_authorize_url. If you don't pass
|
||||
a request_token, the fallback is self.request_token, which
|
||||
will exist if you previously called obtain_request_token on this
|
||||
DropboxSession instance.
|
||||
|
||||
Returns:
|
||||
- A dropbox.session.OAuthToken (key and secret) representing the access token Dropbox assigned
|
||||
to this app and user. Also attaches the access token as self.token.
|
||||
"""
|
||||
request_token = request_token or self.request_token
|
||||
assert request_token, "No request_token available on the session. Please pass one."
|
||||
url = self.build_url(self.API_HOST, '/oauth/access_token')
|
||||
headers, params = self.build_access_headers('POST', url, request_token=request_token)
|
||||
|
||||
response = self.rest_client.POST(url, headers=headers, params=params, raw_response=True)
|
||||
self.token = self._parse_token(response.read())
|
||||
return self.token
|
||||
|
||||
def build_access_headers(self, method, resource_url, params=None, request_token=None):
|
||||
"""Build OAuth access headers for a future request.
|
||||
|
||||
Args:
|
||||
- ``method``: The HTTP method being used (e.g. 'GET' or 'POST').
|
||||
- ``resource_url``: The full url the request will be made to.
|
||||
- ``params``: A dictionary of parameters to add to what's already on the url.
|
||||
Typically, this would consist of POST parameters.
|
||||
|
||||
Returns:
|
||||
- A tuple of (header_dict, params) where header_dict is a dictionary
|
||||
of header names and values appropriate for passing into dropbox.rest.RESTClient
|
||||
and params is a dictionary like the one that was passed in, but augmented with
|
||||
oauth-related parameters as appropriate.
|
||||
"""
|
||||
if params is None:
|
||||
params = {}
|
||||
else:
|
||||
params = params.copy()
|
||||
|
||||
oauth_params = {
|
||||
'oauth_consumer_key' : self.consumer_creds.key,
|
||||
'oauth_timestamp' : self._generate_oauth_timestamp(),
|
||||
'oauth_nonce' : self._generate_oauth_nonce(),
|
||||
'oauth_version' : self._oauth_version(),
|
||||
}
|
||||
|
||||
token = request_token if request_token is not None else self.token
|
||||
|
||||
if token:
|
||||
oauth_params['oauth_token'] = token.key
|
||||
|
||||
self._oauth_sign_request(oauth_params, self.consumer_creds, token)
|
||||
|
||||
params.update(oauth_params)
|
||||
|
||||
return {}, params
|
||||
|
||||
@classmethod
|
||||
def _oauth_sign_request(cls, params, consumer_pair, token_pair):
|
||||
params.update({'oauth_signature_method' : 'PLAINTEXT',
|
||||
'oauth_signature' : ('%s&%s' % (consumer_pair.secret, token_pair.secret)
|
||||
if token_pair is not None else
|
||||
'%s&' % (consumer_pair.secret,))})
|
||||
|
||||
@classmethod
|
||||
def _generate_oauth_timestamp(cls):
|
||||
return int(time.time())
|
||||
|
||||
@classmethod
|
||||
def _generate_oauth_nonce(cls, length=8):
|
||||
return ''.join([str(random.randint(0, 9)) for i in range(length)])
|
||||
|
||||
@classmethod
|
||||
def _oauth_version(cls):
|
||||
return '1.0'
|
||||
|
||||
@classmethod
|
||||
def _parse_token(cls, s):
|
||||
if not s:
|
||||
raise ValueError("Invalid parameter string.")
|
||||
|
||||
params = parse_qs(s, keep_blank_values=False)
|
||||
if not params:
|
||||
raise ValueError("Invalid parameter string: %r" % s)
|
||||
|
||||
try:
|
||||
key = params['oauth_token'][0]
|
||||
except Exception:
|
||||
raise ValueError("'oauth_token' not found in OAuth request.")
|
||||
|
||||
try:
|
||||
secret = params['oauth_token_secret'][0]
|
||||
except Exception:
|
||||
raise ValueError("'oauth_token_secret' not found in "
|
||||
"OAuth request.")
|
||||
|
||||
return OAuthToken(key, secret)
|
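Taken together, the three-step OAuth handshake that the docstrings above describe looks roughly like this. A sketch only: the app key and secret are placeholders, resources/lib is assumed to be on sys.path (as vfs.py assumes), and the printed URL has to be opened and approved by the user in a browser before step 2 will succeed:

from dropbox import session, client

sess = session.DropboxSession('APP_KEY', 'APP_SECRET', 'app_folder')

# Step 1: get a request token and build the URL the user must visit and approve.
request_token = sess.obtain_request_token()
print "Authorize this app at:", sess.build_authorize_url(request_token)

# Step 2: once the user has clicked Allow, trade the request token for an
# access token; this is the pair XBMC Backup caches so the handshake happens once.
access_token = sess.obtain_access_token(request_token)

# Step 3: the linked session can now back a DropboxClient.
db_client = client.DropboxClient(sess)
print db_client.account_info()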
11
resources/lib/dropbox/six.py
Normal file
@ -0,0 +1,11 @@
|
||||
import sys
|
||||
|
||||
def b(str_):
|
||||
if sys.version_info >= (3,):
|
||||
str_ = str_.encode('latin1')
|
||||
return str_
|
||||
|
||||
def u(str_):
|
||||
if sys.version_info < (3,):
|
||||
str_ = str_.decode('latin1')
|
||||
return str_
|
341
resources/lib/dropbox/trusted-certs.crt
Normal file
@ -0,0 +1,341 @@
|
||||
# Subject: C=ZA, ST=Western Cape, L=Cape Town, O=Thawte Consulting cc, OU=Certification Services Division, CN=Thawte Server CA/emailAddress=server-certs@thawte.com
|
||||
# Issuer: C=ZA, ST=Western Cape, L=Cape Town, O=Thawte Consulting cc, OU=Certification Services Division, CN=Thawte Server CA/emailAddress=server-certs@thawte.com
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIDEzCCAnygAwIBAgIBATANBgkqhkiG9w0BAQQFADCBxDELMAkGA1UEBhMCWkEx
|
||||
FTATBgNVBAgTDFdlc3Rlcm4gQ2FwZTESMBAGA1UEBxMJQ2FwZSBUb3duMR0wGwYD
|
||||
VQQKExRUaGF3dGUgQ29uc3VsdGluZyBjYzEoMCYGA1UECxMfQ2VydGlmaWNhdGlv
|
||||
biBTZXJ2aWNlcyBEaXZpc2lvbjEZMBcGA1UEAxMQVGhhd3RlIFNlcnZlciBDQTEm
|
||||
MCQGCSqGSIb3DQEJARYXc2VydmVyLWNlcnRzQHRoYXd0ZS5jb20wHhcNOTYwODAx
|
||||
MDAwMDAwWhcNMjAxMjMxMjM1OTU5WjCBxDELMAkGA1UEBhMCWkExFTATBgNVBAgT
|
||||
DFdlc3Rlcm4gQ2FwZTESMBAGA1UEBxMJQ2FwZSBUb3duMR0wGwYDVQQKExRUaGF3
|
||||
dGUgQ29uc3VsdGluZyBjYzEoMCYGA1UECxMfQ2VydGlmaWNhdGlvbiBTZXJ2aWNl
|
||||
cyBEaXZpc2lvbjEZMBcGA1UEAxMQVGhhd3RlIFNlcnZlciBDQTEmMCQGCSqGSIb3
|
||||
DQEJARYXc2VydmVyLWNlcnRzQHRoYXd0ZS5jb20wgZ8wDQYJKoZIhvcNAQEBBQAD
|
||||
gY0AMIGJAoGBANOkUG7I/1Zr5s9dtuoMaHVHoqrC2oQl/Kj0R1HahbUgdJSGHg91
|
||||
yekIYfUGbTBuFRkC6VLAYttNmZ7iagxEOM3+vuNkCXDF/rFrKbYvScg71CcEJRCX
|
||||
L+eQbcAoQpnXTEPew/UhbVSfXcNY4cDk2VuwuNy0e982OsK1ZiIS1ocNAgMBAAGj
|
||||
EzARMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQEEBQADgYEAB/pMaVz7lcxG
|
||||
7oWDTSEwjsrZqG9JGubaUeNgcGyEYRGhGshIPllDfU+VPaGLtwtimHp1it2ITk6e
|
||||
QNuozDJ0uW8NxuOzRAvZim+aKZuZGCg70eNAKJpaPNW15yAbi8qkq43pUdniTCxZ
|
||||
qdq5snUb9kLy78fyGPmJvKP/iiMucEc=
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=ZA, ST=Western Cape, L=Cape Town, O=Thawte Consulting cc, OU=Certification Services Division, CN=Thawte Premium Server CA/emailAddress=premium-server@thawte.com
|
||||
# Issuer: C=ZA, ST=Western Cape, L=Cape Town, O=Thawte Consulting cc, OU=Certification Services Division, CN=Thawte Premium Server CA/emailAddress=premium-server@thawte.com
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIDJzCCApCgAwIBAgIBATANBgkqhkiG9w0BAQQFADCBzjELMAkGA1UEBhMCWkEx
|
||||
FTATBgNVBAgTDFdlc3Rlcm4gQ2FwZTESMBAGA1UEBxMJQ2FwZSBUb3duMR0wGwYD
|
||||
VQQKExRUaGF3dGUgQ29uc3VsdGluZyBjYzEoMCYGA1UECxMfQ2VydGlmaWNhdGlv
|
||||
biBTZXJ2aWNlcyBEaXZpc2lvbjEhMB8GA1UEAxMYVGhhd3RlIFByZW1pdW0gU2Vy
|
||||
dmVyIENBMSgwJgYJKoZIhvcNAQkBFhlwcmVtaXVtLXNlcnZlckB0aGF3dGUuY29t
|
||||
MB4XDTk2MDgwMTAwMDAwMFoXDTIwMTIzMTIzNTk1OVowgc4xCzAJBgNVBAYTAlpB
|
||||
MRUwEwYDVQQIEwxXZXN0ZXJuIENhcGUxEjAQBgNVBAcTCUNhcGUgVG93bjEdMBsG
|
||||
A1UEChMUVGhhd3RlIENvbnN1bHRpbmcgY2MxKDAmBgNVBAsTH0NlcnRpZmljYXRp
|
||||
b24gU2VydmljZXMgRGl2aXNpb24xITAfBgNVBAMTGFRoYXd0ZSBQcmVtaXVtIFNl
|
||||
cnZlciBDQTEoMCYGCSqGSIb3DQEJARYZcHJlbWl1bS1zZXJ2ZXJAdGhhd3RlLmNv
|
||||
bTCBnzANBgkqhkiG9w0BAQEFAAOBjQAwgYkCgYEA0jY2aovXwlue2oFBYo847kkE
|
||||
VdbQ7xwblRZH7xhINTpS9CtqBo87L+pW46+GjZ4X9560ZXUCTe/LCaIhUdib0GfQ
|
||||
ug2SBhRz1JPLlyoAnFxODLz6FVL88kRu2hFKbgifLy3j+ao6hnO2RlNYyIkFvYMR
|
||||
uHM/qgeN9EJN50CdHDcCAwEAAaMTMBEwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG
|
||||
9w0BAQQFAAOBgQAmSCwWwlj66BZ0DKqqX1Q/8tfJeGBeXm43YyJ3Nn6yF8Q0ufUI
|
||||
hfzJATj/Tb7yFkJD57taRvvBxhEf8UqwKEbJw8RCfbz6q1lu1bdRiBHjpIUZa4JM
|
||||
pAwSremkrj/xw0llmozFyD4lt5SZu5IycQfwhl7tUCemDaYj+bvLpgcUQg==
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, O=VeriSign, Inc., OU=Class 1 Public Primary Certification Authority
|
||||
# Issuer: C=US, O=VeriSign, Inc., OU=Class 1 Public Primary Certification Authority
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIICPDCCAaUCEDJQM89Q0VbzXIGtZVxPyCUwDQYJKoZIhvcNAQECBQAwXzELMAkG
|
||||
A1UEBhMCVVMxFzAVBgNVBAoTDlZlcmlTaWduLCBJbmMuMTcwNQYDVQQLEy5DbGFz
|
||||
cyAxIFB1YmxpYyBQcmltYXJ5IENlcnRpZmljYXRpb24gQXV0aG9yaXR5MB4XDTk2
|
||||
MDEyOTAwMDAwMFoXDTIwMDEwNzIzNTk1OVowXzELMAkGA1UEBhMCVVMxFzAVBgNV
|
||||
BAoTDlZlcmlTaWduLCBJbmMuMTcwNQYDVQQLEy5DbGFzcyAxIFB1YmxpYyBQcmlt
|
||||
YXJ5IENlcnRpZmljYXRpb24gQXV0aG9yaXR5MIGfMA0GCSqGSIb3DQEBAQUAA4GN
|
||||
ADCBiQKBgQDlGb9to1ZhLZlIcfZn3rmN67eehoAKkQ76OCWvRoiC5XOooJskXQ0f
|
||||
zGVuDLDQVoQYh5oGmxChc9+0WDlrbsH2FdWoqD+qEgaNMax/sDTXjzRniAnNFBHi
|
||||
TkVWaR94AoDa3EeRKbs2yWNcxeDXLYd7obcysHswuiovMaruo2fa2wIDAQABMA0G
|
||||
CSqGSIb3DQEBAgUAA4GBAEtEZmBoZOSYG/OwcuaViXzde7OVwB0u2NgZ0C00PcZQ
|
||||
mhCGjKo/O6gE/DdSlcPZydvN8oYGxLEb8IKIMEKOF1AcZHq4PplJdJf8rAJD+5YM
|
||||
VgQlDHx8h50kp9jwMim1pN9dokzFFjKoQvZFprY2ueC/ZTaTwtLXa9zeWdaiNfhF
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, O=VeriSign, Inc., OU=Class 3 Public Primary Certification Authority
|
||||
# Issuer: C=US, O=VeriSign, Inc., OU=Class 3 Public Primary Certification Authority
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIICPDCCAaUCEHC65B0Q2Sk0tjjKewPMur8wDQYJKoZIhvcNAQECBQAwXzELMAkG
|
||||
A1UEBhMCVVMxFzAVBgNVBAoTDlZlcmlTaWduLCBJbmMuMTcwNQYDVQQLEy5DbGFz
|
||||
cyAzIFB1YmxpYyBQcmltYXJ5IENlcnRpZmljYXRpb24gQXV0aG9yaXR5MB4XDTk2
|
||||
MDEyOTAwMDAwMFoXDTI4MDgwMTIzNTk1OVowXzELMAkGA1UEBhMCVVMxFzAVBgNV
|
||||
BAoTDlZlcmlTaWduLCBJbmMuMTcwNQYDVQQLEy5DbGFzcyAzIFB1YmxpYyBQcmlt
|
||||
YXJ5IENlcnRpZmljYXRpb24gQXV0aG9yaXR5MIGfMA0GCSqGSIb3DQEBAQUAA4GN
|
||||
ADCBiQKBgQDJXFme8huKARS0EN8EQNvjV69qRUCPhAwL0TPZ2RHP7gJYHyX3KqhE
|
||||
BarsAx94f56TuZoAqiN91qyFomNFx3InzPRMxnVx0jnvT0Lwdd8KkMaOIG+YD/is
|
||||
I19wKTakyYbnsZogy1Olhec9vn2a/iRFM9x2Fe0PonFkTGUugWhFpwIDAQABMA0G
|
||||
CSqGSIb3DQEBAgUAA4GBALtMEivPLCYATxQT3ab7/AoRhIzzKBxnki98tsX63/Do
|
||||
lbwdj2wsqFHMc9ikwFPwTtYmwHYBV4GSXiHx0bH/59AhWM1pF+NEHJwZRDmJXNyc
|
||||
AA9WjQKZ7aKQRUzkuxCkPfAyAw7xzvjoyVGM5mKf5p/AfbdynMk2OmufTqj/ZA1k
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, O=RSA Data Security, Inc., OU=Secure Server Certification Authority
|
||||
# Issuer: C=US, O=RSA Data Security, Inc., OU=Secure Server Certification Authority
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIICNDCCAaECEAKtZn5ORf5eV288mBle3cAwDQYJKoZIhvcNAQECBQAwXzELMAkG
|
||||
A1UEBhMCVVMxIDAeBgNVBAoTF1JTQSBEYXRhIFNlY3VyaXR5LCBJbmMuMS4wLAYD
|
||||
VQQLEyVTZWN1cmUgU2VydmVyIENlcnRpZmljYXRpb24gQXV0aG9yaXR5MB4XDTk0
|
||||
MTEwOTAwMDAwMFoXDTEwMDEwNzIzNTk1OVowXzELMAkGA1UEBhMCVVMxIDAeBgNV
|
||||
BAoTF1JTQSBEYXRhIFNlY3VyaXR5LCBJbmMuMS4wLAYDVQQLEyVTZWN1cmUgU2Vy
|
||||
dmVyIENlcnRpZmljYXRpb24gQXV0aG9yaXR5MIGbMA0GCSqGSIb3DQEBAQUAA4GJ
|
||||
ADCBhQJ+AJLOesGugz5aqomDV6wlAXYMra6OLDfO6zV4ZFQD5YRAUcm/jwjiioII
|
||||
0haGN1XpsSECrXZogZoFokvJSyVmIlZsiAeP94FZbYQHZXATcXY+m3dM41CJVphI
|
||||
uR2nKRoTLkoRWZweFdVJVCxzOmmCsZc5nG1wZ0jl3S3WyB57AgMBAAEwDQYJKoZI
|
||||
hvcNAQECBQADfgBl3X7hsuyw4jrg7HFGmhkRuNPHoLQDQCYCPgmc4RKz0Vr2N6W3
|
||||
YQO2WxZpO8ZECAyIUwxrl0nHPjXcbLm7qt9cuzovk2C2qUtN8iD3zV9/ZHuO3ABc
|
||||
1/p3yjkWWW8O6tO1g39NTUJWdrTJXwT4OPjr0l91X817/OWOgHz8UA==
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, O=Equifax Secure Inc., CN=Equifax Secure Global eBusiness CA-1
|
||||
# Issuer: C=US, O=Equifax Secure Inc., CN=Equifax Secure Global eBusiness CA-1
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIICkDCCAfmgAwIBAgIBATANBgkqhkiG9w0BAQQFADBaMQswCQYDVQQGEwJVUzEc
|
||||
MBoGA1UEChMTRXF1aWZheCBTZWN1cmUgSW5jLjEtMCsGA1UEAxMkRXF1aWZheCBT
|
||||
ZWN1cmUgR2xvYmFsIGVCdXNpbmVzcyBDQS0xMB4XDTk5MDYyMTA0MDAwMFoXDTIw
|
||||
MDYyMTA0MDAwMFowWjELMAkGA1UEBhMCVVMxHDAaBgNVBAoTE0VxdWlmYXggU2Vj
|
||||
dXJlIEluYy4xLTArBgNVBAMTJEVxdWlmYXggU2VjdXJlIEdsb2JhbCBlQnVzaW5l
|
||||
c3MgQ0EtMTCBnzANBgkqhkiG9w0BAQEFAAOBjQAwgYkCgYEAuucXkAJlsTRVPEnC
|
||||
UdXfp9E3j9HngXNBUmCbnaEXJnitx7HoJpQytd4zjTov2/KaelpzmKNc6fuKcxtc
|
||||
58O/gGzNqfTWK8D3+ZmqY6KxRwIP1ORROhI8bIpaVIRw28HFkM9yRcuoWcDNM50/
|
||||
o5brhTMhHD4ePmBudpxnhcXIw2ECAwEAAaNmMGQwEQYJYIZIAYb4QgEBBAQDAgAH
|
||||
MA8GA1UdEwEB/wQFMAMBAf8wHwYDVR0jBBgwFoAUvqigdHJQa0S3ySPY+6j/s1dr
|
||||
aGwwHQYDVR0OBBYEFL6ooHRyUGtEt8kj2Puo/7NXa2hsMA0GCSqGSIb3DQEBBAUA
|
||||
A4GBADDiAVGqx+pf2rnQZQ8w1j7aDRRJbpGTJxQx78T3LUX47Me/okENI7SS+RkA
|
||||
Z70Br83gcfxaz2TE4JaY0KNA4gGK7ycH8WUBikQtBmV1UsCGECAhX2xrD2yuCRyv
|
||||
8qIYNMR1pHMc8Y3c7635s3a0kr/clRAevsvIO1qEYBlWlKlV
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, ST=UT, L=Salt Lake City, O=The USERTRUST Network, OU=http://www.usertrust.com, CN=UTN-USERFirst-Hardware
|
||||
# Issuer: C=US, ST=UT, L=Salt Lake City, O=The USERTRUST Network, OU=http://www.usertrust.com, CN=UTN-USERFirst-Hardware
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIEdDCCA1ygAwIBAgIQRL4Mi1AAJLQR0zYq/mUK/TANBgkqhkiG9w0BAQUFADCB
|
||||
lzELMAkGA1UEBhMCVVMxCzAJBgNVBAgTAlVUMRcwFQYDVQQHEw5TYWx0IExha2Ug
|
||||
Q2l0eTEeMBwGA1UEChMVVGhlIFVTRVJUUlVTVCBOZXR3b3JrMSEwHwYDVQQLExho
|
||||
dHRwOi8vd3d3LnVzZXJ0cnVzdC5jb20xHzAdBgNVBAMTFlVUTi1VU0VSRmlyc3Qt
|
||||
SGFyZHdhcmUwHhcNOTkwNzA5MTgxMDQyWhcNMTkwNzA5MTgxOTIyWjCBlzELMAkG
|
||||
A1UEBhMCVVMxCzAJBgNVBAgTAlVUMRcwFQYDVQQHEw5TYWx0IExha2UgQ2l0eTEe
|
||||
MBwGA1UEChMVVGhlIFVTRVJUUlVTVCBOZXR3b3JrMSEwHwYDVQQLExhodHRwOi8v
|
||||
d3d3LnVzZXJ0cnVzdC5jb20xHzAdBgNVBAMTFlVUTi1VU0VSRmlyc3QtSGFyZHdh
|
||||
cmUwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCx98M4P7Sof885glFn
|
||||
0G2f0v9Y8+efK+wNiVSZuTiZFvfgIXlIwrthdBKWHTxqctU8EGc6Oe0rE81m65UJ
|
||||
M6Rsl7HoxuzBdXmcRl6Nq9Bq/bkqVRcQVLMZ8Jr28bFdtqdt++BxF2uiiPsA3/4a
|
||||
MXcMmgF6sTLjKwEHOG7DpV4jvEWbe1DByTCP2+UretNb+zNAHqDVmBe8i4fDidNd
|
||||
oI6yqqr2jmmIBsX6iSHzCJ1pLgkzmykNRg+MzEk0sGlRvfkGzWitZky8PqxhvQqI
|
||||
DsjfPe58BEydCl5rkdbux+0ojatNh4lz0G6k0B4WixThdkQDf2Os5M1JnMWS9Ksy
|
||||
oUhbAgMBAAGjgbkwgbYwCwYDVR0PBAQDAgHGMA8GA1UdEwEB/wQFMAMBAf8wHQYD
|
||||
VR0OBBYEFKFyXyYbKJhDlV0HN9WFlp1L0sNFMEQGA1UdHwQ9MDswOaA3oDWGM2h0
|
||||
dHA6Ly9jcmwudXNlcnRydXN0LmNvbS9VVE4tVVNFUkZpcnN0LUhhcmR3YXJlLmNy
|
||||
bDAxBgNVHSUEKjAoBggrBgEFBQcDAQYIKwYBBQUHAwUGCCsGAQUFBwMGBggrBgEF
|
||||
BQcDBzANBgkqhkiG9w0BAQUFAAOCAQEARxkP3nTGmZev/K0oXnWO6y1n7k57K9cM
|
||||
//bey1WiCuFMVGWTYGufEpytXoMs61quwOQt9ABjHbjAbPLPSbtNk28Gpgoiskli
|
||||
CE7/yMgUsogWXecB5BKV5UU0s4tpvc+0hY91UZ59Ojg6FEgSxvunOxqNDYJAB+gE
|
||||
CJChicsZUN/KHAG8HQQZexB2lzvukJDKxA4fFm517zP4029bHpbj4HR3dHuKom4t
|
||||
3XbWOTCC8KucUvIqx69JXn7HaOWCgchqJ/kniCrVWFCVH/A7HFe7fRQ5YiuayZSS
|
||||
KqMiDP+JJn1fIytH1xUdqWqeUQ0qUZ6B+dQ7XnASfxAynB67nfhmqA==
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, O=Network Solutions L.L.C., CN=Network Solutions Certificate Authority
|
||||
# Issuer: C=US, O=Network Solutions L.L.C., CN=Network Solutions Certificate Authority
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIID5jCCAs6gAwIBAgIQV8szb8JcFuZHFhfjkDFo4DANBgkqhkiG9w0BAQUFADBi
|
||||
MQswCQYDVQQGEwJVUzEhMB8GA1UEChMYTmV0d29yayBTb2x1dGlvbnMgTC5MLkMu
|
||||
MTAwLgYDVQQDEydOZXR3b3JrIFNvbHV0aW9ucyBDZXJ0aWZpY2F0ZSBBdXRob3Jp
|
||||
dHkwHhcNMDYxMjAxMDAwMDAwWhcNMjkxMjMxMjM1OTU5WjBiMQswCQYDVQQGEwJV
|
||||
UzEhMB8GA1UEChMYTmV0d29yayBTb2x1dGlvbnMgTC5MLkMuMTAwLgYDVQQDEydO
|
||||
ZXR3b3JrIFNvbHV0aW9ucyBDZXJ0aWZpY2F0ZSBBdXRob3JpdHkwggEiMA0GCSqG
|
||||
SIb3DQEBAQUAA4IBDwAwggEKAoIBAQDkvH6SMG3G2I4rC7xGzuAnlt7e+foS0zwz
|
||||
c7MEL7xxjOWftiJgPl9dzgn/ggwbmlFQGiaJ3dVhXRncEg8tCqJDXRfQNJIg6nPP
|
||||
OCwGJgl6cvf6UDL4wpPTaaIjzkGxzOTVHzbRijr4jGPiFFlp7Q3Tf2vouAPlT2rl
|
||||
mGNpSAW+Lv8ztumXWWn4Zxmuk2GWRBXTcrA/vGp97Eh/jcOrqnErU2lBUzS1sLnF
|
||||
BgrEsEX1QV1uiUV7PTsmjHTC5dLRfbIR1PtYMiKagMnc/Qzpf14Dl847ABSHJ3A4
|
||||
qY5usyd2mFHgBeMhqxrVhSI8KbWaFsWAqPS7azCPL0YCorEMIuDTAgMBAAGjgZcw
|
||||
gZQwHQYDVR0OBBYEFCEwyfsA106Y2oeqKtCnLrFAMadMMA4GA1UdDwEB/wQEAwIB
|
||||
BjAPBgNVHRMBAf8EBTADAQH/MFIGA1UdHwRLMEkwR6BFoEOGQWh0dHA6Ly9jcmwu
|
||||
bmV0c29sc3NsLmNvbS9OZXR3b3JrU29sdXRpb25zQ2VydGlmaWNhdGVBdXRob3Jp
|
||||
dHkuY3JsMA0GCSqGSIb3DQEBBQUAA4IBAQC7rkvnt1frf6ott3NHhWrB5KUd5Oc8
|
||||
6fRZZXe1eltajSU24HqXLjjAV2CDmAaDn7l2em5Q4LqILPxFzBiwmZVRDuwduIj/
|
||||
h1AcgsLj4DKAv6ALR8jDMe+ZZzKATxcheQxpXN5eNK4CtSbqUN9/GGUsyfJj4akH
|
||||
/nxxH2szJGoeBfcFaMBqEssuXmHLrijTfsK0ZpEmXzwuJF/LWA/rKOyvEZbz3Htv
|
||||
wKeI8lN3s2Berq4o2jUsbzRF0ybh3uxbTydrFny9RAQYgrOJeRcQcT16ohZO9QHN
|
||||
pGxlaKFJdlxDydi8NmdspZS11My5vWo1ViHe2MPr+8ukYEywVaCge1ey
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, O=The Go Daddy Group, Inc., OU=Go Daddy Class 2 Certification Authority
|
||||
# Issuer: C=US, O=The Go Daddy Group, Inc., OU=Go Daddy Class 2 Certification Authority
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIEADCCAuigAwIBAgIBADANBgkqhkiG9w0BAQUFADBjMQswCQYDVQQGEwJVUzEh
|
||||
MB8GA1UEChMYVGhlIEdvIERhZGR5IEdyb3VwLCBJbmMuMTEwLwYDVQQLEyhHbyBE
|
||||
YWRkeSBDbGFzcyAyIENlcnRpZmljYXRpb24gQXV0aG9yaXR5MB4XDTA0MDYyOTE3
|
||||
MDYyMFoXDTM0MDYyOTE3MDYyMFowYzELMAkGA1UEBhMCVVMxITAfBgNVBAoTGFRo
|
||||
ZSBHbyBEYWRkeSBHcm91cCwgSW5jLjExMC8GA1UECxMoR28gRGFkZHkgQ2xhc3Mg
|
||||
MiBDZXJ0aWZpY2F0aW9uIEF1dGhvcml0eTCCASAwDQYJKoZIhvcNAQEBBQADggEN
|
||||
ADCCAQgCggEBAN6d1+pXGEmhW+vXX0iG6r7d/+TvZxz0ZWizV3GgXne77ZtJ6XCA
|
||||
PVYYYwhv2vLM0D9/AlQiVBDYsoHUwHU9S3/Hd8M+eKsaA7Ugay9qK7HFiH7Eux6w
|
||||
wdhFJ2+qN1j3hybX2C32qRe3H3I2TqYXP2WYktsqbl2i/ojgC95/5Y0V4evLOtXi
|
||||
EqITLdiOr18SPaAIBQi2XKVlOARFmR6jYGB0xUGlcmIbYsUfb18aQr4CUWWoriMY
|
||||
avx4A6lNf4DD+qta/KFApMoZFv6yyO9ecw3ud72a9nmYvLEHZ6IVDd2gWMZEewo+
|
||||
YihfukEHU1jPEX44dMX4/7VpkI+EdOqXG68CAQOjgcAwgb0wHQYDVR0OBBYEFNLE
|
||||
sNKR1EwRcbNhyz2h/t2oatTjMIGNBgNVHSMEgYUwgYKAFNLEsNKR1EwRcbNhyz2h
|
||||
/t2oatTjoWekZTBjMQswCQYDVQQGEwJVUzEhMB8GA1UEChMYVGhlIEdvIERhZGR5
|
||||
IEdyb3VwLCBJbmMuMTEwLwYDVQQLEyhHbyBEYWRkeSBDbGFzcyAyIENlcnRpZmlj
|
||||
YXRpb24gQXV0aG9yaXR5ggEAMAwGA1UdEwQFMAMBAf8wDQYJKoZIhvcNAQEFBQAD
|
||||
ggEBADJL87LKPpH8EsahB4yOd6AzBhRckB4Y9wimPQoZ+YeAEW5p5JYXMP80kWNy
|
||||
OO7MHAGjHZQopDH2esRU1/blMVgDoszOYtuURXO1v0XJJLXVggKtI3lpjbi2Tc7P
|
||||
TMozI+gciKqdi0FuFskg5YmezTvacPd+mSYgFFQlq25zheabIZ0KbIIOqPjCDPoQ
|
||||
HmyW74cNxA9hi63ugyuV+I6ShHI56yDqg+2DzZduCLzrTia2cyvk0/ZM/iZx4mER
|
||||
dEr/VxqHD3VILs9RaRegAhJhldXRQLIQTO7ErBBDpqWeCtWVYpoNz4iCxTIM5Cuf
|
||||
ReYNnyicsbkqWletNw+vHX/bvZ8=
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, ST=Arizona, L=Scottsdale, O=GoDaddy.com, Inc., OU=http://certificates.godaddy.com/repository, CN=Go Daddy Secure Certification Authority/serialNumber=07969287
|
||||
# Issuer: C=US, O=The Go Daddy Group, Inc., OU=Go Daddy Class 2 Certification Authority
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIE3jCCA8agAwIBAgICAwEwDQYJKoZIhvcNAQEFBQAwYzELMAkGA1UEBhMCVVMx
|
||||
ITAfBgNVBAoTGFRoZSBHbyBEYWRkeSBHcm91cCwgSW5jLjExMC8GA1UECxMoR28g
|
||||
RGFkZHkgQ2xhc3MgMiBDZXJ0aWZpY2F0aW9uIEF1dGhvcml0eTAeFw0wNjExMTYw
|
||||
MTU0MzdaFw0yNjExMTYwMTU0MzdaMIHKMQswCQYDVQQGEwJVUzEQMA4GA1UECBMH
|
||||
QXJpem9uYTETMBEGA1UEBxMKU2NvdHRzZGFsZTEaMBgGA1UEChMRR29EYWRkeS5j
|
||||
b20sIEluYy4xMzAxBgNVBAsTKmh0dHA6Ly9jZXJ0aWZpY2F0ZXMuZ29kYWRkeS5j
|
||||
b20vcmVwb3NpdG9yeTEwMC4GA1UEAxMnR28gRGFkZHkgU2VjdXJlIENlcnRpZmlj
|
||||
YXRpb24gQXV0aG9yaXR5MREwDwYDVQQFEwgwNzk2OTI4NzCCASIwDQYJKoZIhvcN
|
||||
AQEBBQADggEPADCCAQoCggEBAMQt1RWMnCZM7DI161+4WQFapmGBWTtwY6vj3D3H
|
||||
KrjJM9N55DrtPDAjhI6zMBS2sofDPZVUBJ7fmd0LJR4h3mUpfjWoqVTr9vcyOdQm
|
||||
VZWt7/v+WIbXnvQAjYwqDL1CBM6nPwT27oDyqu9SoWlm2r4arV3aLGbqGmu75RpR
|
||||
SgAvSMeYddi5Kcju+GZtCpyz8/x4fKL4o/K1w/O5epHBp+YlLpyo7RJlbmr2EkRT
|
||||
cDCVw5wrWCs9CHRK8r5RsL+H0EwnWGu1NcWdrxcx+AuP7q2BNgWJCJjPOq8lh8BJ
|
||||
6qf9Z/dFjpfMFDniNoW1fho3/Rb2cRGadDAW/hOUoz+EDU8CAwEAAaOCATIwggEu
|
||||
MB0GA1UdDgQWBBT9rGEyk2xF1uLuhV+auud2mWjM5zAfBgNVHSMEGDAWgBTSxLDS
|
||||
kdRMEXGzYcs9of7dqGrU4zASBgNVHRMBAf8ECDAGAQH/AgEAMDMGCCsGAQUFBwEB
|
||||
BCcwJTAjBggrBgEFBQcwAYYXaHR0cDovL29jc3AuZ29kYWRkeS5jb20wRgYDVR0f
|
||||
BD8wPTA7oDmgN4Y1aHR0cDovL2NlcnRpZmljYXRlcy5nb2RhZGR5LmNvbS9yZXBv
|
||||
c2l0b3J5L2dkcm9vdC5jcmwwSwYDVR0gBEQwQjBABgRVHSAAMDgwNgYIKwYBBQUH
|
||||
AgEWKmh0dHA6Ly9jZXJ0aWZpY2F0ZXMuZ29kYWRkeS5jb20vcmVwb3NpdG9yeTAO
|
||||
BgNVHQ8BAf8EBAMCAQYwDQYJKoZIhvcNAQEFBQADggEBANKGwOy9+aG2Z+5mC6IG
|
||||
OgRQjhVyrEp0lVPLN8tESe8HkGsz2ZbwlFalEzAFPIUyIXvJxwqoJKSQ3kbTJSMU
|
||||
A2fCENZvD117esyfxVgqwcSeIaha86ykRvOe5GPLL5CkKSkB2XIsKd83ASe8T+5o
|
||||
0yGPwLPk9Qnt0hCqU7S+8MxZC9Y7lhyVJEnfzuz9p0iRFEUOOjZv2kWzRaJBydTX
|
||||
RE4+uXR21aITVSzGh6O1mawGhId/dQb8vxRMDsxuxN89txJx9OjxUUAiKEngHUuH
|
||||
qDTMBqLdElrRhjZkAzVvb3du6/KFUJheqwNTrZEjYx8WnM25sgVjOuH0aBsXBTWV
|
||||
U+4=
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, ST=Arizona, L=Scottsdale, O=GoDaddy.com, Inc., CN=Go Daddy Root Certificate Authority - G2
|
||||
# Issuer: C=US, ST=Arizona, L=Scottsdale, O=GoDaddy.com, Inc., CN=Go Daddy Root Certificate Authority - G2
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIDxTCCAq2gAwIBAgIBADANBgkqhkiG9w0BAQsFADCBgzELMAkGA1UEBhMCVVMx
|
||||
EDAOBgNVBAgTB0FyaXpvbmExEzARBgNVBAcTClNjb3R0c2RhbGUxGjAYBgNVBAoT
|
||||
EUdvRGFkZHkuY29tLCBJbmMuMTEwLwYDVQQDEyhHbyBEYWRkeSBSb290IENlcnRp
|
||||
ZmljYXRlIEF1dGhvcml0eSAtIEcyMB4XDTA5MDkwMTAwMDAwMFoXDTM3MTIzMTIz
|
||||
NTk1OVowgYMxCzAJBgNVBAYTAlVTMRAwDgYDVQQIEwdBcml6b25hMRMwEQYDVQQH
|
||||
EwpTY290dHNkYWxlMRowGAYDVQQKExFHb0RhZGR5LmNvbSwgSW5jLjExMC8GA1UE
|
||||
AxMoR28gRGFkZHkgUm9vdCBDZXJ0aWZpY2F0ZSBBdXRob3JpdHkgLSBHMjCCASIw
|
||||
DQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAL9xYgjx+lk09xvJGKP3gElY6SKD
|
||||
E6bFIEMBO4Tx5oVJnyfq9oQbTqC023CYxzIBsQU+B07u9PpPL1kwIuerGVZr4oAH
|
||||
/PMWdYA5UXvl+TW2dE6pjYIT5LY/qQOD+qK+ihVqf94Lw7YZFAXK6sOoBJQ7Rnwy
|
||||
DfMAZiLIjWltNowRGLfTshxgtDj6AozO091GB94KPutdfMh8+7ArU6SSYmlRJQVh
|
||||
GkSBjCypQ5Yj36w6gZoOKcUcqeldHraenjAKOc7xiID7S13MMuyFYkMlNAJWJwGR
|
||||
tDtwKj9useiciAF9n9T521NtYJ2/LOdYq7hfRvzOxBsDPAnrSTFcaUaz4EcCAwEA
|
||||
AaNCMEAwDwYDVR0TAQH/BAUwAwEB/zAOBgNVHQ8BAf8EBAMCAQYwHQYDVR0OBBYE
|
||||
FDqahQcQZyi27/a9BUFuIMGU2g/eMA0GCSqGSIb3DQEBCwUAA4IBAQCZ21151fmX
|
||||
WWcDYfF+OwYxdS2hII5PZYe096acvNjpL9DbWu7PdIxztDhC2gV7+AJ1uP2lsdeu
|
||||
9tfeE8tTEH6KRtGX+rcuKxGrkLAngPnon1rpN5+r5N9ss4UXnT3ZJE95kTXWXwTr
|
||||
gIOrmgIttRD02JDHBHNA7XIloKmf7J6raBKZV8aPEjoJpL1E/QYVN8Gb5DKj7Tjo
|
||||
2GTzLH4U/ALqn83/B2gX2yKQOC16jdFU8WnjXzPKej17CuPKf1855eJ1usV2GDPO
|
||||
LPAvTK33sefOT6jEm0pUBsV/fdUID+Ic/n4XuKxe9tQWskMJDE32p2u0mYRlynqI
|
||||
4uJEvlz36hz1
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, O=GeoTrust Inc., CN=GeoTrust Global CA
|
||||
# Issuer: C=US, O=GeoTrust Inc., CN=GeoTrust Global CA
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIDVDCCAjygAwIBAgIDAjRWMA0GCSqGSIb3DQEBBQUAMEIxCzAJBgNVBAYTAlVT
|
||||
MRYwFAYDVQQKEw1HZW9UcnVzdCBJbmMuMRswGQYDVQQDExJHZW9UcnVzdCBHbG9i
|
||||
YWwgQ0EwHhcNMDIwNTIxMDQwMDAwWhcNMjIwNTIxMDQwMDAwWjBCMQswCQYDVQQG
|
||||
EwJVUzEWMBQGA1UEChMNR2VvVHJ1c3QgSW5jLjEbMBkGA1UEAxMSR2VvVHJ1c3Qg
|
||||
R2xvYmFsIENBMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA2swYYzD9
|
||||
9BcjGlZ+W988bDjkcbd4kdS8odhM+KhDtgPpTSEHCIjaWC9mOSm9BXiLnTjoBbdq
|
||||
fnGk5sRgprDvgOSJKA+eJdbtg/OtppHHmMlCGDUUna2YRpIuT8rxh0PBFpVXLVDv
|
||||
iS2Aelet8u5fa9IAjbkU+BQVNdnARqN7csiRv8lVK83Qlz6cJmTM386DGXHKTubU
|
||||
1XupGc1V3sjs0l44U+VcT4wt/lAjNvxm5suOpDkZALeVAjmRCw7+OC7RHQWa9k0+
|
||||
bw8HHa8sHo9gOeL6NlMTOdReJivbPagUvTLrGAMoUgRx5aszPeE4uwc2hGKceeoW
|
||||
MPRfwCvocWvk+QIDAQABo1MwUTAPBgNVHRMBAf8EBTADAQH/MB0GA1UdDgQWBBTA
|
||||
ephojYn7qwVkDBF9qn1luMrMTjAfBgNVHSMEGDAWgBTAephojYn7qwVkDBF9qn1l
|
||||
uMrMTjANBgkqhkiG9w0BAQUFAAOCAQEANeMpauUvXVSOKVCUn5kaFOSPeCpilKIn
|
||||
Z57QzxpeR+nBsqTP3UEaBU6bS+5Kb1VSsyShNwrrZHYqLizz/Tt1kL/6cdjHPTfS
|
||||
tQWVYrmm3ok9Nns4d0iXrKYgjy6myQzCsplFAMfOEVEiIuCl6rYVSAlk6l5PdPcF
|
||||
PseKUgzbFbS9bZvlxrFUaKnjaZC2mqUPuLk/IH2uSrW4nOQdtqvmlKXBx4Ot2/Un
|
||||
hw4EbNX/3aBd7YdStysVAq45pmp06drE57xNNB6pXE0zX5IJL4hmXXeXxx12E6nV
|
||||
5fEWCRE11azbJHFwLJhWC9kXtNHjUStedejV0NxPNO3CBWaAocvmMw==
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, O=GeoTrust Inc., CN=GeoTrust Primary Certification Authority
|
||||
# Issuer: C=US, O=GeoTrust Inc., CN=GeoTrust Primary Certification Authority
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIDfDCCAmSgAwIBAgIQGKy1av1pthU6Y2yv2vrEoTANBgkqhkiG9w0BAQUFADBY
|
||||
MQswCQYDVQQGEwJVUzEWMBQGA1UEChMNR2VvVHJ1c3QgSW5jLjExMC8GA1UEAxMo
|
||||
R2VvVHJ1c3QgUHJpbWFyeSBDZXJ0aWZpY2F0aW9uIEF1dGhvcml0eTAeFw0wNjEx
|
||||
MjcwMDAwMDBaFw0zNjA3MTYyMzU5NTlaMFgxCzAJBgNVBAYTAlVTMRYwFAYDVQQK
|
||||
Ew1HZW9UcnVzdCBJbmMuMTEwLwYDVQQDEyhHZW9UcnVzdCBQcmltYXJ5IENlcnRp
|
||||
ZmljYXRpb24gQXV0aG9yaXR5MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKC
|
||||
AQEAvrgVe//UfH1nrYNke8hCUy3f9oQIIGHWAVlqnEQRr+92/ZV+zmEwu3qDXwK9
|
||||
AWbK7hWNb6EwnL2hhZ6UOvNWiAAxz9juapYC2e0DjPt1befquFUWBRaa9OBesYjA
|
||||
ZIVcFU2Ix7e64HXprQU9nceJSOC7KMgD4TCTZF5SwFlwIjVXiIrxlQqD17wxcwE0
|
||||
7e9GceBrAqg1cmuXm2bgyxx5X9gaBGgeRwLmnWDiNpcB3841kt++Z8dtd1k7j53W
|
||||
kBWUvEI0EME5+bEnPn7WinXFsq+W06Lem+SYvn3h6YGttm/81w7a4DSwDRp35+MI
|
||||
mO9Y+pyEtzavwt+s0vQQBnBxNQIDAQABo0IwQDAPBgNVHRMBAf8EBTADAQH/MA4G
|
||||
A1UdDwEB/wQEAwIBBjAdBgNVHQ4EFgQULNVQQZcVi/CPNmFbSvtr2ZnJM5IwDQYJ
|
||||
KoZIhvcNAQEFBQADggEBAFpwfyzdtzRP9YZRqSa+S7iq8XEN3GHHoOo0Hnp3DwQ1
|
||||
6CePbJC/kRYkRj5KTs4rFtULUh38H2eiAkUxT87z+gOneZ1TatnaYzr4gNfTmeGl
|
||||
4b7UVXGYNTq+k+qurUKykG/g/CFNNWMziUnWm07Kx+dOCQD32sfvmWKZd7aVIl6K
|
||||
oKv0uHiYyjgZmclynnjNS6yvGaBzEi38wkG6gZHaFloxt/m0cYASSJlyc1pZU8Fj
|
||||
UjPtp8nSOQJw+uCxQmYpqptR7TBUIhRf2asdweSU8Pj1K/fqynhG1riR/aYNKxoU
|
||||
AT6A8EKglQdebc3MS6RFjasS6LPeWuWgfOgPIh1a6Vk=
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: C=US, O=The Go Daddy Group, Inc., OU=Go Daddy Class 2 Certification Authority
|
||||
# Issuer: L=ValiCert Validation Network, O=ValiCert, Inc., OU=ValiCert Class 2 Policy Validation Authority, CN=http://www.valicert.com//emailAddress=info@valicert.com
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIE+zCCBGSgAwIBAgICAQ0wDQYJKoZIhvcNAQEFBQAwgbsxJDAiBgNVBAcTG1Zh
|
||||
bGlDZXJ0IFZhbGlkYXRpb24gTmV0d29yazEXMBUGA1UEChMOVmFsaUNlcnQsIElu
|
||||
Yy4xNTAzBgNVBAsTLFZhbGlDZXJ0IENsYXNzIDIgUG9saWN5IFZhbGlkYXRpb24g
|
||||
QXV0aG9yaXR5MSEwHwYDVQQDExhodHRwOi8vd3d3LnZhbGljZXJ0LmNvbS8xIDAe
|
||||
BgkqhkiG9w0BCQEWEWluZm9AdmFsaWNlcnQuY29tMB4XDTA0MDYyOTE3MDYyMFoX
|
||||
DTI0MDYyOTE3MDYyMFowYzELMAkGA1UEBhMCVVMxITAfBgNVBAoTGFRoZSBHbyBE
|
||||
YWRkeSBHcm91cCwgSW5jLjExMC8GA1UECxMoR28gRGFkZHkgQ2xhc3MgMiBDZXJ0
|
||||
aWZpY2F0aW9uIEF1dGhvcml0eTCCASAwDQYJKoZIhvcNAQEBBQADggENADCCAQgC
|
||||
ggEBAN6d1+pXGEmhW+vXX0iG6r7d/+TvZxz0ZWizV3GgXne77ZtJ6XCAPVYYYwhv
|
||||
2vLM0D9/AlQiVBDYsoHUwHU9S3/Hd8M+eKsaA7Ugay9qK7HFiH7Eux6wwdhFJ2+q
|
||||
N1j3hybX2C32qRe3H3I2TqYXP2WYktsqbl2i/ojgC95/5Y0V4evLOtXiEqITLdiO
|
||||
r18SPaAIBQi2XKVlOARFmR6jYGB0xUGlcmIbYsUfb18aQr4CUWWoriMYavx4A6lN
|
||||
f4DD+qta/KFApMoZFv6yyO9ecw3ud72a9nmYvLEHZ6IVDd2gWMZEewo+YihfukEH
|
||||
U1jPEX44dMX4/7VpkI+EdOqXG68CAQOjggHhMIIB3TAdBgNVHQ4EFgQU0sSw0pHU
|
||||
TBFxs2HLPaH+3ahq1OMwgdIGA1UdIwSByjCBx6GBwaSBvjCBuzEkMCIGA1UEBxMb
|
||||
VmFsaUNlcnQgVmFsaWRhdGlvbiBOZXR3b3JrMRcwFQYDVQQKEw5WYWxpQ2VydCwg
|
||||
SW5jLjE1MDMGA1UECxMsVmFsaUNlcnQgQ2xhc3MgMiBQb2xpY3kgVmFsaWRhdGlv
|
||||
biBBdXRob3JpdHkxITAfBgNVBAMTGGh0dHA6Ly93d3cudmFsaWNlcnQuY29tLzEg
|
||||
MB4GCSqGSIb3DQEJARYRaW5mb0B2YWxpY2VydC5jb22CAQEwDwYDVR0TAQH/BAUw
|
||||
AwEB/zAzBggrBgEFBQcBAQQnMCUwIwYIKwYBBQUHMAGGF2h0dHA6Ly9vY3NwLmdv
|
||||
ZGFkZHkuY29tMEQGA1UdHwQ9MDswOaA3oDWGM2h0dHA6Ly9jZXJ0aWZpY2F0ZXMu
|
||||
Z29kYWRkeS5jb20vcmVwb3NpdG9yeS9yb290LmNybDBLBgNVHSAERDBCMEAGBFUd
|
||||
IAAwODA2BggrBgEFBQcCARYqaHR0cDovL2NlcnRpZmljYXRlcy5nb2RhZGR5LmNv
|
||||
bS9yZXBvc2l0b3J5MA4GA1UdDwEB/wQEAwIBBjANBgkqhkiG9w0BAQUFAAOBgQC1
|
||||
QPmnHfbq/qQaQlpE9xXUhUaJwL6e4+PrxeNYiY+Sn1eocSxI0YGyeR+sBjUZsE4O
|
||||
WBsUs5iB0QQeyAfJg594RAoYC5jcdnplDQ1tgMQLARzLrUc+cb53S8wGd9D0Vmsf
|
||||
SxOaFIqII6hR8INMqzW/Rn453HWkrugp++85j09VZw==
|
||||
-----END CERTIFICATE-----
|
||||
# Subject: L=ValiCert Validation Network, O=ValiCert, Inc., OU=ValiCert Class 2 Policy Validation Authority, CN=http://www.valicert.com//emailAddress=info@valicert.com
|
||||
# Issuer: L=ValiCert Validation Network, O=ValiCert, Inc., OU=ValiCert Class 2 Policy Validation Authority, CN=http://www.valicert.com//emailAddress=info@valicert.com
|
||||
-----BEGIN CERTIFICATE-----
|
||||
MIIC5zCCAlACAQEwDQYJKoZIhvcNAQEFBQAwgbsxJDAiBgNVBAcTG1ZhbGlDZXJ0
|
||||
IFZhbGlkYXRpb24gTmV0d29yazEXMBUGA1UEChMOVmFsaUNlcnQsIEluYy4xNTAz
|
||||
BgNVBAsTLFZhbGlDZXJ0IENsYXNzIDIgUG9saWN5IFZhbGlkYXRpb24gQXV0aG9y
|
||||
aXR5MSEwHwYDVQQDExhodHRwOi8vd3d3LnZhbGljZXJ0LmNvbS8xIDAeBgkqhkiG
|
||||
9w0BCQEWEWluZm9AdmFsaWNlcnQuY29tMB4XDTk5MDYyNjAwMTk1NFoXDTE5MDYy
|
||||
NjAwMTk1NFowgbsxJDAiBgNVBAcTG1ZhbGlDZXJ0IFZhbGlkYXRpb24gTmV0d29y
|
||||
azEXMBUGA1UEChMOVmFsaUNlcnQsIEluYy4xNTAzBgNVBAsTLFZhbGlDZXJ0IENs
|
||||
YXNzIDIgUG9saWN5IFZhbGlkYXRpb24gQXV0aG9yaXR5MSEwHwYDVQQDExhodHRw
|
||||
Oi8vd3d3LnZhbGljZXJ0LmNvbS8xIDAeBgkqhkiG9w0BCQEWEWluZm9AdmFsaWNl
|
||||
cnQuY29tMIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDOOnHK5avIWZJV16vY
|
||||
dA757tn2VUdZZUcOBVXc65g2PFxTXdMwzzjsvUGJ7SVCCSRrCl6zfN1SLUzm1NZ9
|
||||
WlmpZdRJEy0kTRxQb7XBhVQ7/nHk01xC+YDgkRoKWzk2Z/M/VXwbP7RfZHM047QS
|
||||
v4dk+NoS/zcnwbNDu+97bi5p9wIDAQABMA0GCSqGSIb3DQEBBQUAA4GBADt/UG9v
|
||||
UJSZSWI4OB9L+KXIPqeCgfYrx+jFzug6EILLGACOTb2oWH+heQC1u+mNr0HZDzTu
|
||||
IYEZoDJJKPTEjlbVUjP9UNV+mWwD5MlM/Mtsq2azSiGM5bUMMj4QssxsodyamEwC
|
||||
W/POuZ6lcg5Ktz885hZo+L7tdEy8W9ViH0Pd
|
||||
-----END CERTIFICATE-----
|
||||
|
||||
|
53
resources/lib/dropbox/util.py
Normal file
@ -0,0 +1,53 @@
|
||||
import os
|
||||
|
||||
class AnalyzeFileObjBug(Exception):
|
||||
msg = ("\n"
|
||||
"Expected file object to have %d bytes, instead we read %d bytes.\n"
|
||||
"File size detection may have failed (see dropbox.util.AnalyzeFileObj)\n")
|
||||
def __init__(self, expected, actual):
|
||||
self.expected = expected
|
||||
self.actual = actual
|
||||
|
||||
def __str__(self):
|
||||
return self.msg % (self.expected, self.actual)
|
||||
|
||||
def analyze_file_obj(obj):
|
||||
""" Get the size and contents of a file-like object.
|
||||
Returns: (size, raw_data)
|
||||
size: The amount of data waiting to be read
|
||||
raw_data: If not None, the entire contents of the stream (as a string).
|
||||
None if the stream should be read() in chunks.
|
||||
"""
|
||||
pos = 0
|
||||
if hasattr(obj, 'tell'):
|
||||
pos = obj.tell()
|
||||
|
||||
# Handle cStringIO and StringIO
|
||||
if hasattr(obj, 'getvalue'):
|
||||
# Why using getvalue() makes sense:
|
||||
# For StringIO, this string is pre-computed anyway by read().
|
||||
# For cStringIO, getvalue() is the only way
|
||||
# to determine the length without read()'ing the whole thing.
|
||||
raw_data = obj.getvalue()
|
||||
if pos == 0:
|
||||
return (len(raw_data), raw_data)
|
||||
else:
|
||||
# We could return raw_data[pos:], but that could drastically
|
||||
# increase memory usage. Better to read it block at a time.
|
||||
size = max(0, len(raw_data) - pos)
|
||||
return (size, None)
|
||||
|
||||
# Handle real files
|
||||
if hasattr(obj, 'fileno'):
|
||||
size = max(0, os.fstat(obj.fileno()).st_size - pos)
|
||||
return (size, None)
|
||||
|
||||
# User-defined object with len()
|
||||
if hasattr(obj, '__len__'):
|
||||
size = max(0, len(obj) - pos)
|
||||
return (size, None)
|
||||
|
||||
# We don't know what kind of stream this is.
|
||||
# To determine the size, we must read the whole thing.
|
||||
raw_data = obj.read()
|
||||
return (len(raw_data), raw_data)
|
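A quick sketch of what analyze_file_obj() reports for the two common inputs, an in-memory buffer versus a real file (the file name is illustrative, and the import assumes resources/lib is on sys.path as in vfs.py):

from StringIO import StringIO
from dropbox import util

# In-memory buffer: the full contents are already available as a string.
size, raw = util.analyze_file_obj(StringIO("hello world"))
# size == 11, raw == "hello world"

# Real file: only the size is known up front; rest.py then streams it in chunks.
f = open("backup.zip", "rb")
size, raw = util.analyze_file_obj(f)   # size == file size in bytes, raw is None
f.close()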
@ -7,6 +7,9 @@ __Addon = xbmcaddon.Addon(__addon_id__)
|
||||
def data_dir():
|
||||
return __Addon.getAddonInfo('profile')
|
||||
|
||||
def addon_dir():
|
||||
return __Addon.getAddonInfo('path')
|
||||
|
||||
def log(message,loglevel=xbmc.LOGNOTICE):
|
||||
xbmc.log(encode(__addon_id__ + ": " + message),level=loglevel)
|
||||
|
||||
|
173
resources/lib/vfs.py
Normal file
@ -0,0 +1,173 @@
|
||||
import utils as utils
|
||||
import xbmc
|
||||
import xbmcvfs
|
||||
import xbmcgui
|
||||
import sys
|
||||
from dropbox import client, rest, session
|
||||
|
||||
APP_KEY = 'f5wlmek6aoriqax'
|
||||
APP_SECRET = 'b1461sje1kxgzet'
|
||||
|
||||
class Vfs:
|
||||
root_path = None
|
||||
|
||||
def set_root(self,rootString):
|
||||
old_root = self.root_path
|
||||
self.root_path = rootString
|
||||
|
||||
#fix slashes
|
||||
self.root_path = self.root_path.replace("\\","/")
|
||||
|
||||
#check if trailing slash is included
|
||||
if(self.root_path[-1:] != "/"):
|
||||
self.root_path = self.root_path + "/"
|
||||
|
||||
#return the old root
|
||||
return old_root
|
||||
|
||||
def listdir(self,directory):
|
||||
return {}
|
||||
|
||||
def mkdir(self,directory):
|
||||
return True
|
||||
|
||||
def put(self,source,dest):
|
||||
return True
|
||||
|
||||
def getFile(self,source):
|
||||
return True
|
||||
|
||||
def rmdir(self,directory):
|
||||
return True
|
||||
|
||||
def exists(self,aFile):
|
||||
return True
|
||||
|
||||
class XBMCFileSystem(Vfs):
|
||||
|
||||
def listdir(self,directory):
|
||||
return xbmcvfs.listdir(directory)
|
||||
|
||||
def mkdir(self,directory):
|
||||
return xbmcvfs.mkdir(directory)
|
||||
|
||||
def put(self,source,dest):
|
||||
return xbmcvfs.copy(source,dest)
|
||||
|
||||
def rmdir(self,directory):
|
||||
return xbmcvfs.rmdir(directory,True)
|
||||
|
||||
def exists(self,aFile):
|
||||
return xbmcvfs.exists(aFile)
|
||||
|
||||
class DropboxFileSystem(Vfs):
|
||||
client = None
|
||||
|
||||
def __init__(self):
|
||||
user_token_key,user_token_secret = self.getToken()
|
||||
|
||||
sess = session.DropboxSession(APP_KEY,APP_SECRET,"app_folder")
|
||||
|
||||
if(user_token_key == '' and user_token_secret == ''):
|
||||
token = sess.obtain_request_token()
|
||||
url = sess.build_authorize_url(token)
|
||||
|
||||
#print url in log
|
||||
utils.log("Authorize URL: " + url)
|
||||
xbmcgui.Dialog().ok(utils.getString(30010),utils.getString(30056),utils.getString(30057))
|
||||
|
||||
#if user authorized this will work
|
||||
user_token = sess.obtain_access_token(token)
|
||||
self.setToken(user_token.key,user_token.secret)
|
||||
|
||||
else:
|
||||
sess.set_token(user_token_key,user_token_secret)
|
||||
|
||||
self.client = client.DropboxClient(sess)
|
||||
utils.log(str(self.client.account_info()))
|
||||
|
||||
def listdir(self,directory):
|
||||
if(self.client != None and self.exists(directory)):
|
||||
files = []
|
||||
dirs = []
|
||||
metadata = self.client.metadata(directory)
|
||||
|
||||
for aFile in metadata['contents']:
|
||||
if(aFile['is_dir']):
|
||||
dirs.append(aFile['path'][len(directory):])
|
||||
else:
|
||||
files.append(aFile['path'][len(directory):])
|
||||
|
||||
return [dirs,files]
|
||||
else:
|
||||
return [[],[]]
|
||||
|
||||
|
||||
def mkdir(self,directory):
|
||||
if(self.client != None):
|
||||
if(not self.exists(directory)):
|
||||
self.client.file_create_folder(directory)
|
||||
return True
|
||||
else:
|
||||
return False
|
||||
|
||||
def rmdir(self,directory):
    if(self.client != None and self.exists(directory)):
        #dropbox removes the folder along with its contents
        self.client.file_delete(directory)
        return True
    else:
        return False
|
||||
|
||||
def exists(self,aFile):
|
||||
if(self.client != None):
|
||||
try:
|
||||
meta_data = self.client.metadata(aFile)
|
||||
#if we make it here the file does exist
|
||||
return True
|
||||
except:
|
||||
return False
|
||||
else:
|
||||
return False
|
||||
|
||||
def put(self,source,dest,retry=True):
    if(self.client != None):
        f = open(source,'rb')
        try:
            response = self.client.put_file(dest,f,True)
            return True
        except:
            #if the upload fails, retry it once before giving up
            if(retry):
                return self.put(source,dest,False)
            return False
        finally:
            f.close()
    else:
        return False

def get_file(self,source,dest):
    if(self.client != None):
        #write the file locally
        out = open(dest,'wb')
        f = self.client.get_file(source).read()
        out.write(f)
        out.close()
        return True
    else:
        return False
|
||||
def setToken(self,key,secret):
|
||||
#write the token files
|
||||
token_file = open(xbmc.translatePath(utils.data_dir() + "tokens.txt"),'w')
|
||||
token_file.write("%s|%s" % (key,secret))
|
||||
token_file.close()
|
||||
|
||||
def getToken(self):
|
||||
#get tokens, if they exist
|
||||
if(xbmcvfs.exists(xbmc.translatePath(utils.data_dir() + "tokens.txt"))):
|
||||
token_file = open(xbmc.translatePath(utils.data_dir() + "tokens.txt"))
|
||||
key,secret = token_file.read().split('|')
|
||||
token_file.close()
|
||||
|
||||
return [key,secret]
|
||||
else:
|
||||
return ["",""]
|
||||
|
||||
|
||||
|
||||
|
||||
|
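For reference, a short sketch of how the two Vfs implementations above are meant to be driven. It only runs inside XBMC (where the xbmc module and the addon's resources package are importable), the folder name is illustrative, and on its first run DropboxFileSystem() walks the user through the authorization step in its constructor:

import xbmc
import resources.lib.vfs as vfs

# Same interface locally (xbmcvfs) and remotely (Dropbox).
local = vfs.XBMCFileSystem()
local.set_root(xbmc.translatePath("special://home/userdata/"))

remote = vfs.DropboxFileSystem()    # first run triggers the Dropbox authorization dialog
remote.set_root("/20121116/")
remote.mkdir(remote.root_path)

# Copy one file up and list what ended up in the backup folder.
remote.put(local.root_path + "guisettings.xml", remote.root_path + "guisettings.xml")
dirs, files = remote.listdir(remote.root_path)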
@ -1,7 +1,7 @@
|
||||
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
|
||||
<settings>
|
||||
<category id="general" label="30011">
|
||||
<setting id="remote_selection" type="enum" lvalues="30018|30019" default="0" label="30025"/>
|
||||
<setting id="remote_selection" type="enum" lvalues="30018|30019|30027" default="0" label="30025"/>
|
||||
<setting id="remote_path_2" type="text" label="30024" default="" visible="eq(-1,1)" />
|
||||
<setting id="remote_path" type="folder" label="30020" visible="eq(-2,0)" />
|
||||
<setting id="backup_rotation" type="number" label="30026" default="0" />
|
||||
|