# HG changeset patch
# User Bernhard Reiter
# Date 1448955838 -3600
# Node ID 0b65eb9ecb7992daa3eae3b2071cb054231e2972
# Parent  99c68ebfb3b9a81790f970d1b524576174646c95
Cleanup: Added README.creole. Removed old file.

diff -r 99c68ebfb3b9 -r 0b65eb9ecb79 README.creole
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/README.creole	Tue Dec 01 08:43:58 2015 +0100
@@ -0,0 +1,20 @@
+= Roundup Issue Collector
+
+Grab and display data from a http://roundup-tracker.org/ instance.
+
+It is Free Software, check out the header files.
+
+=== Notes
+
+When migrating to revision 3:99c68ebfb3b9 (Nov 30 17:46:22 2015),
+you need to add print statements for the Content-Type header
+to all of your CGI scripts.
+
+=== Prerequisites
+
+Python v3, with built-in sqlite3 module.
+
+=== Included
+
+http://d3js.org/ initially used with 3.5.5
+"Library released under BSD license. Copyright 2015 Mike Bostock."

diff -r 99c68ebfb3b9 -r 0b65eb9ecb79 collect_issues
--- a/collect_issues	Mon Nov 30 17:46:22 2015 +0100
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,123 +0,0 @@
-#!/usr/bin/env python3
-
-""" Fetch issues from a roundup-tracker and save them in a database.
-
-author: Sascha L. Teichmann
-author: Bernhard Reiter
-author: Sean Engelhardt
-
-(c) 2010,2015 by Intevation GmbH
-
-This is Free Software under the terms of the
-GNU GENERAL PUBLIC LICENSE Version 3 or later.
-See http://www.gnu.org/licenses/gpl-3.0.txt for details
-"""
-
-import http.cookiejar
-import urllib.parse
-import urllib.request
-import csv
-import io
-import sqlite3 as db
-import os
-import roundup_content_data as rcd
-
-
-# Types of URLs
-BASE_URL = "http://10.42.7.199:8917/demo/"
-SEARCH_URL = "issue?@action=export_csv&@columns=title,priority&@filter=status&@pagesize=50&@startwith=0&status=-1,1,2,3,4,5,6,7"
-REPORT_URL = BASE_URL + SEARCH_URL
-
-
-LOGIN_PARAMETERS = (
-    ("__login_name", "demo"),
-    ("__login_password", "demo"),
-    ("@action", "Login"),
-    )
-
-
-# SQL-COMMANDS
-
-
-def connect_to_server():
-    enc_data = urllib.parse.urlencode(LOGIN_PARAMETERS).encode()
-    cj = http.cookiejar.CookieJar()
-    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
-    req = urllib.request.Request(url=BASE_URL, data=enc_data)
-    opener.open(req)
-    return opener
-
-
-def get_issues_as_csv(opener):
-    csv_req = urllib.request.Request(url=REPORT_URL)
-    f = opener.open(csv_req)
-
-    reader = csv.DictReader(io.TextIOWrapper(f))
-
-    return reader
-
-
-def check_create_database(database_file):
-    if not os.path.isfile(database_file):
-        con = None
-        cur = None
-        try:
-            con = db.connect(database_file)
-            cur = con.cursor()
-            try:
-                cur.execute(rcd.CREATE_DB)
-                con.commit()
-                os.chmod(rcd.DATABASE_FILE, 0o744)
-            except:
-                con.rollback()
-                raise
-        finally:
-            if cur:
-                cur.close()
-            if con:
-                con.close()
-
-
-def issues_to_quantities(rows):
-    quantities = [0] * len(rcd.COLUMNS)
-    for row in rows:
-        quantities[int(row["priority"]) - 1] += 1
-    return quantities
-
-
-def save_issues_to_db(quantities, database_file):
-    check_create_database(database_file)
-
-    cur = None
-    con = None
-
-    try:
-        con = db.connect(database_file)
-        cur = con.cursor()
-        try:
-            cur.execute(rcd.INSERT_NEW, quantities)
-            con.commit()
-        except:
-            con.rollback()
-            raise
-    finally:
-        if cur:
-            cur.close()
-        if con:
-            con.close()
-
-
-def save_stats_in_db():
-    try:
-        opener = connect_to_server()
-        current_issues_csv = get_issues_as_csv(opener)
-        opener.close()
-
-        quantities = issues_to_quantities(current_issues_csv)
-
-        save_issues_to_db(quantities, rcd.DATABASE_FILE)
-    except urllib.error.URLError:
-        print("No valid connection to server: " + BASE_URL)
-
-
-save_stats_in_db()
\ No newline at end of file
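The README's migration note says each CGI script must now print its own Content-Type header. A minimal sketch of what that means, assuming a Python 3 CGI script; the `cgi_header` helper name and the example body are hypothetical, not part of this repository:

```python
#!/usr/bin/env python3
# Sketch of the migration note from README.creole: after this change, every
# CGI script must emit a Content-Type header followed by a blank line before
# any body output. The helper name below is a hypothetical example.

def cgi_header(content_type="text/html"):
    """Return the CGI header block a script must print before its body."""
    return "Content-Type: " + content_type + "\r\n\r\n"

# A migrated script would start its output like this:
print(cgi_header(), end="")
print("<h1>Issue statistics</h1>")
```

The blank line after the header is what separates the CGI header block from the response body; omitting it is the usual cause of "malformed header" errors from the web server.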