Hello testers, Kiwi TCMS has taken on a brave new mission! We would like to transform the testing process by making it more organized, transparent & accountable for everyone on your team. Our goal is to improve engineering productivity and participation in testing. This blog post outlines how we plan to achieve this and which goals we have set for ourselves this year.
Last year we took on the challenge of bringing a legacy code base up to modern coding standards. We did not complete that effort but made very good progress along the way. This is not a small task, which is why our team will continue with it this year.
This includes refactoring legacy JavaScript helpers like postToURL() and jQ.ajax() on the front-end.

There are 59 templates remaining to be converted to a modern look and feel. Along with them comes more refactoring and even a redesign of the existing pages and the workflow around them. Together with the refactoring this will make Kiwi TCMS easier to use and easier to maintain.
We are planning to remove the existing reports feature because they are not well designed. We will re-implement existing functionality that our community finds useful, add new types of reports (incl. nicer graphics and UI) and make it possible for the reporting sub-system to be more easily extendable.
The phase-out is planned to begin after 1st March 2019! Until then we are looking for your feedback. Please comment in Issue #657!
These will make it easier to collect results from automated test suites into Kiwi TCMS for later analysis. Instead of creating scripts that parse the results and talk to our API you will only have to install an additional package in your test environment and configure the test runner to use it! Automation test results will then appear inside Kiwi TCMS.
If you would like to use such functionality leave your vote inside GitHub issues! In case you would like to write a test-runner plugin you can find the specification here.
Question: Does Kiwi TCMS integrate with JIRA?
Answer: Well, it does. How exactly do you want to integrate?
... silence ...
This dialog plays out every time someone asks me about bug-tracker integration, especially with JIRA. The thing is, integration is a specific set of behaviors which may or may not be desired by a particular team. As of now Kiwi TCMS is able to open a URL to your bug tracker with predefined field values, add comments to bug reports and report a simple summary of bugs inside a TestRun.
We recognize this may not be enough, and together with the community we really need to define what bug-tracker integration means! Broader application lifecycle management tools (of which a TCMS is a subset) often include an integrated bug tracking system. We could add something like this and save you the trouble of using JIRA; however, many teams have already invested in integrating their existing infrastructure or simply prefer other tools. For example, we love GitHub Issues and our team regularly files public reports about issues that we find internally!
Developers have their GitHub PR flow, and if they have done the job of writing unit tests then they will merge only when things are green! This leaves additional testing efforts on the sidelines and doesn't really help with transparency and visibility. I'm not even going to mention automatically deployed staging environments for every change, because very few teams have managed to do this effectively.
Please share and +1 your wildest ideas in Issue #700.

Speaking of modern engineering flow: is your team truly agile? When and how do you plan your testing activities? Before the devel sprint or afterwards? How many testers take part in refining the product backlog and working on user stories?
Similar to GitHub flow lots of teams and open source projects are using Trello to effectively organize their development process. Testing should not be left behind and Kiwi TCMS may be able to help.
We would like to explore the devel-test-planning process for agile teams and what we can do to make this easier for testers.
Please share and +1 your wildest ideas in Issue #701.

What makes a test engineer productive when they need to assess product risk and new features, when mapping product requirements documents (PRD) to test plans and test cases, or when collaborating on user stories and behavior specifications? What makes developers, product owners, designers and other professionals productive when it comes to dealing with testing?
For example consider the following workflow:
Later we iterate through the sprints and for each sprint something like this happens:
Devel is also part of testing, right? Product owners, UX and interaction designers are as well. Producing a quality software product is a team effort!
At every step of the way Kiwi TCMS can provide notification wizards, guidelines and/or documentation for best practices, and facilitate tooling, e.g. to write user stories and assess them or to map out an exploratory testing session. The list of ideas is virtually endless. We could even go into deep learning, AI and blockchain, but honestly, who knows how to use them in testing?
Our team is not quite sure what this goal will look like 3 months from now, but we are certain that testing needs to happen first, last and all the time during the entire software development lifecycle. By providing the necessary functionality and tools in Kiwi TCMS we can boost engineering productivity and steer the testing process in your organization into a better, more productive direction which welcomes participation from all engineering groups.
Let's consider another point of view: testing is a creative activity which benefits from putting your brain into a specific state of mind! For example, Gherkin (the Given-When-Then language) has the benefit of forcing you to think about behavior; while doing so you vocalize the various roles in the system, what kinds of actions are accepted and what sort of result is expected! Many times this will help you remember or discover missing scenarios and edge cases, and raise even more questions!
Crazy ideas, brain dumps and +1 reactions, as always, are welcome in Issue #703.
Coding alone is not fun! Here's what you can do to help us:

- press the Star button on our GitHub repository
- leave a +1 reaction on GitHub issues (top-right corner)

We are also looking to expand our core team and the list of occasional contributors. The following are mostly organizational goals:
Our team will be working on areas related to the goals above. A +1 reaction on GitHub issues will help us prioritize what we work on!
Bug fixes and other issues will be occasionally slipped into the stream and pull requests from non-team contributors will be reviewed and merged in a timely fashion.
There is at least 1 full day of work that goes behind the scenes when a new version is officially released: compile changelog, build images and upload them, create blog post and newsletter announcement, share on social media, etc. We also deploy on our own Kiwi TCMS instance as a stop-gap measure before making everything public!
New PyPI tarballs and Docker images will be released every few weeks as we see fit; this has been our standard process. We try to align releases with Django's release schedule and to cut a new version when known security vulnerabilities have been fixed. However, we can't guarantee this will always be the case!
If you are in a hurry and need something quickly the best option is to send a pull request, build your own Docker image from source and maybe consider sponsoring us via Open Collective!
Happy testing!
Hello everyone, in this article I will outline the progress that the Kiwi TCMS team has made towards achieving the goals in our 2018 roadmap (mid-year update here). TL;DR: goals are 62% complete. Refactoring legacy code is showing good results, less so on the front-end side, and some items are still in progress!
Status: good progress
Initially CodeClimate reported a "D" rating with 600+ code smells and 600+ duplications and a 1 year estimation to resolve these. We're now down to "C" rating with 171 smells and 203 duplications.
The level of technical debt has dropped from 32.5% down to 17.7% and we have removed around 14000 lines of Python code and 8000 lines of JavaScript code without losing significant functionality.
Check out the stats for more info!
Status: almost finished
Both pylint and pylint-django have been integrated into our CI workflow. There are even
some custom built plugins that we use. The number of issues reported is down to 100
from 4000+ initially. These are predominantly fixme
comments which are also in parts
of the code that are scheduled for removal and refactoring.
Status: moderate progress
Several views were modified to return pure JSON but we've not done any targeted work to resolve this issue. A number of other views have been removed in favor of using the existing JSON-RPC layer.
This is an internal refactoring effort which isn't very visible from the outside. It is also one of the factors contributing to the large amount of removed source code.
Status: no progress
Not much has been done in this area except the occasional refactoring to JSON-RPC.
Status: complete
Status: moderate progress, dropped
All RPC methods have been documented! The rest of the internals will be documented as we go along.
Status: good progress
We still carry around jQuery, jQuery-UI and Handlebars.js. They will be removed once the pages using them are converted to use the Patternfly widgets library.
Status: moderate progress
There are still over 50 HTML templates in tcms/templates/
that need to be
refactored into Patternfly. We've been working on them one at a time and will
focus more on this effort in the next couple of months.
Status: moderate progress
Some of the pages have been converted to use Patternfly. The most important pages that still have a different look and feel are TestPlan view, TestCase view and TestRun view. These are also the hardest to convert because they have lots of tabs/components which pull information from various places. Our goal is to create reusable widgets for the various components (e.g. a list of TestCases) and then include these components into several different templates to minimize code duplication.
Status: moderate progress
A number of JavaScript functions have been refactored and removed during the past few releases but there are still thousands of lines of code left to deal with. This effort is mostly happening in parallel with the Patternfly redesign. We still don't have anything to test front-end JavaScript functionality!
Status: good progress
We are seeing a steady stream of new users registered on https://demo.kiwitcms.org and there are several active contributors on GitHub. Most of our translators are very active and keep their respective languages fresh and up to date!
Kiwi TCMS was represented at OSCAL Tirana, DjangoCon Heidelberg, PyCon Prague, HackConf Sofia, PiterPy St. Petersburg and OpenFest Sofia. We've also been approved for a project stand at FOSDEM 2019 so watch this blog for more news.
Happy testing!
Happy Monday testers! Kiwi TCMS needs your help! We are looking for developers who wish to create plugins for popular test runners that will record test results in Kiwi TCMS! Initially we are looking for plugins for Python's unittest, Django and JUnit!
When working with automated testing you have several components:
- test cases, e.g. a test_models.py file which contains tests for your software

Very often all of the components above live together inside the testing framework, but they don't need to. For example, the standard unittest module in Python provides a test runner, but there are also nose and py.test, and Django provides its own test runner that knows how to work with the database.
Once you agree to writing a plugin we are going to create a separate GitHub repository where you will be granted write privileges making you an independent contributor to the Kiwi TCMS project!
Design and architecture of the plugin is up to you, following the practices established by the testing framework in question. You will also have to create a test suite for your plugin based on the specification below.
You are expected to use demo.kiwitcms.org while working on the plugin and afterwards. This is known as eating your own dog food!
For Python we provide the tcms-api
module which already takes care of the
communications layer. For other languages you will have to create this layer or
depend on other open source libraries that provide an XML-RPC or JSON-RPC
client!
You can use this gist for inspiration!
Please use the comments section to discuss unclear behavior and missing scenarios!
SUMMARY: Will use configuration file if it exists
GIVEN: the file ~/.tcms.conf exists
WHEN: plugin initializes
THEN: the plugin will log an info message, read the file and
THEN: configure TCMS_API_URL, TCMS_USERNAME, TCMS_PASSWORD
variables with the respective values
SUMMARY: Will use ENVIRONMENT if configuration file doesn't exist
GIVEN: the file ~/.tcms.conf does not exist
WHEN: plugin initializes
THEN: the plugin will read configuration from environment and configure
the following variables/class members:
TCMS_API_URL, TCMS_USERNAME and TCMS_PASSWORD
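Scenarios 1 and 2 can be sketched with nothing but the standard library. Note that the [tcms] section name and the key names below are our assumptions for illustration — the real ~/.tcms.conf format is defined by the tcms-api module:

```python
import configparser
import os


def load_config(path=os.path.expanduser('~/.tcms.conf')):
    """Return (url, username, password), preferring the config file
    (scenario 1) and falling back to environment variables (scenario 2).

    The [tcms] section and key names are illustrative assumptions.
    """
    if os.path.exists(path):
        parser = configparser.ConfigParser()
        parser.read(path)
        section = parser['tcms']
        return section['url'], section['username'], section['password']

    # file doesn't exist -> read from the environment
    return (os.environ.get('TCMS_API_URL', ''),
            os.environ.get('TCMS_USERNAME', ''),
            os.environ.get('TCMS_PASSWORD', ''))
```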
SUMMARY: Will exit if TCMS_API_URL not configured
GIVEN: TCMS_API_URL variable is empty
WHEN: plugin initializes
THEN: log a warning message and exit
AND: depending on the test runner framework set exit status 1
SUMMARY: Will exit if TCMS_USERNAME not configured
GIVEN: TCMS_USERNAME is empty
WHEN: plugin initializes
THEN: log a warning message and exit
AND: depending on the test runner framework set exit status 1
SUMMARY: Will exit if TCMS_PASSWORD not configured
GIVEN: TCMS_PASSWORD is empty
WHEN: plugin initializes
THEN: log a warning message and exit
AND: depending on the test runner framework set exit status 1
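Scenarios 3, 4 and 5 boil down to the same guard. A sketch (function name ours; how the exit is performed ultimately depends on the test runner framework, sys.exit(1) being the simplest choice):

```python
import logging
import sys


def validate_config(url, username, password):
    """Exit with status 1 if any required setting is missing
    (scenarios 3, 4 and 5 above)."""
    for name, value in (('TCMS_API_URL', url),
                        ('TCMS_USERNAME', username),
                        ('TCMS_PASSWORD', password)):
        if not value:
            # warn, then bail out before any tests execute
            logging.warning('%s is not configured', name)
            sys.exit(1)
```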
SUMMARY: Will re-use existing TestPlan if configured
GIVEN: TCMS_RUN_ID environment variable is not empty
WHEN: plugin initializes
THEN: this will be the Current_TestRun record to which the plugin is
going to add test execution results
AND: Current_TestPlan document in which the plugin will
search for test cases becomes Current_TestRun.plan
SUMMARY: Will create new TestPlan & TestRun if TCMS_RUN_ID not configured
GIVEN: TCMS_RUN_ID environment variable is empty
THEN: plugin will create a new TestPlan in Kiwi TCMS with attributes:
name='Automated test plan for %(product)'
product='%(product)'
product_version='%(version)'
type='Unit'
WHERE: %(product) is a placeholder for TCMS_PRODUCT==TRAVIS_REPO_SLUG==JOB_NAME
%(version) is a placeholder for TCMS_PRODUCT_VERSION==TRAVIS_COMMIT==TRAVIS_PULL_REQUEST_SHA==GIT_COMMIT
THEN: plugin will create a new TestRun in Kiwi TCMS with attributes:
summary='Automated test run ....'
plan=Current TestPlan
build='%(build)'
manager=TCMS_USERNAME
WHERE: %(build) is a placeholder for TCMS_BUILD==TRAVIS_BUILD_NUMBER==BUILD_NUMBER
Environment variables are specified in:
https://docs.travis-ci.com/user/environment-variables#default-environment-variables
https://wiki.jenkins.io/display/JENKINS/Building+a+software+project#Buildingasoftwareproject-belowJenkinsSetEnvironmentVariables
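The `==` chains above mean "take the first environment variable that is set". A small sketch of that fallback resolution (the helper name and the 'unknown' default are our own choices):

```python
import os


def first_env(*names, default='unknown'):
    """Return the value of the first defined environment variable,
    e.g. TCMS_PRODUCT, then Travis' TRAVIS_REPO_SLUG, then
    Jenkins' JOB_NAME."""
    for name in names:
        value = os.environ.get(name)
        if value:
            return value
    return default


product = first_env('TCMS_PRODUCT', 'TRAVIS_REPO_SLUG', 'JOB_NAME')
version = first_env('TCMS_PRODUCT_VERSION', 'TRAVIS_COMMIT',
                    'TRAVIS_PULL_REQUEST_SHA', 'GIT_COMMIT')
build = first_env('TCMS_BUILD', 'TRAVIS_BUILD_NUMBER', 'BUILD_NUMBER')
```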
SUMMARY: Will not create duplicate Product, Version & Build if they already exist
GIVEN: TCMS_RUN_ID is not configured
AND: %(product) exists
AND: %(version) exists
AND: %(build) exists
WHEN: plugin tries to auto-create TestPlan and TestRun
THEN: plugin will re-use %(product), %(version) and %(build) from the database
AND: not try to auto-create them
SUMMARY: Will auto-create Product, Version & Build if they don't exist
GIVEN: TCMS_RUN_ID is not configured
AND: %(product) doesn't exist
AND: %(version) doesn't exist
AND: %(build) doesn't exist
WHEN: plugin tries to auto-create TestPlan and TestRun
THEN: %(product), %(version) and %(build) will be created automatically
SUMMARY: Unit test names are added to TestPlan
GIVEN: we have good plugin configuration
WHEN: plugin loops over unit tests emitted by the test runner
THEN: plugin will check Current_TestPlan for a TestCase with the same name
AND: if test case doesn't exist in Current_TestPlan
THEN: it will be added to Current_TestPlan
hint: it is probably best to process all unit test results at the end!
SUMMARY: Unit test names are added to TestRun
GIVEN: we have good plugin configuration
WHEN: plugin loops over unit tests emitted by the test runner
THEN: plugin will check Current_TestRun for a TestCaseRun object which matches
the current unit test name
hint: (or Current_TestCase object from previous scenario, depending on implementation)
AND: if such TestCaseRun doesn't exist in Current_TestRun
THEN: it will be added to Current_TestRun
hint: it is probably best to process all unit test results at the end!
SUMMARY: Current_TestRun is updated with unit test results
GIVEN: we have good plugin configuration
WHEN: plugin loops over unit tests emitted by the test runner
THEN: plugin will check Current_TestRun for a TestCaseRun object which matches
the current unit test name
hint: (or Current_TestCase object from previous scenario, depending on implementation)
AND: if TestCaseRun object exists in Current_TestRun
THEN: its status will be updated with the execution result coming from the test runner
hint: it is probably best to process all unit test results at the end!
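Taken together, the last three scenarios amount to a single end-of-run pass over the collected results. Below is a rough sketch: `rpc` stands in for whatever client your plugin uses, and the method names and payloads only loosely mirror the Kiwi TCMS API — treat them as assumptions and consult the real API documentation:

```python
def record_results(rpc, plan_id, run_id, results):
    """Process all unit test results at the end of the run:
    make sure each test name exists as a TestCase in the plan,
    make sure it is part of the run, then record its status.

    All rpc method names and payloads here are simplified
    assumptions, not the literal Kiwi TCMS API.
    """
    for name, status in results:
        found = rpc.TestCase.filter({'plan': plan_id, 'summary': name})
        if found:
            case = found[0]
        else:
            # scenario: unit test names are added to TestPlan
            case = rpc.TestCase.create({'summary': name})
            rpc.TestPlan.add_case(plan_id, case['case_id'])

        # scenario: unit test names are added to TestRun
        case_run = rpc.TestRun.add_case(run_id, case['case_id'])

        # scenario: Current_TestRun is updated with unit test results
        rpc.TestCaseRun.update(case_run['case_run_id'], {'status': status})
```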
Happy testing!
A friend from Red Hat sent me an email asking about Kiwi TCMS performance so I did an experiment to establish a baseline. For API requests I got 7.5 req/sec or 130 msec/req which is 1.5x slower than GitHub!
I used perf-script
(gist here)
to measure that. The script takes the first 250 test cases from our test suite
and on every execution creates a new TestPlan (1 API request), then creates
new test cases (250 requests), adds cases to test plan (250 requests),
creates new product build (1 API request), creates new TestRun (1 API request),
adds test cases to test run (250 requests) and finally updates the statuses
(250 requests).
A total of 1003 API requests are sent to Kiwi TCMS every time you start this script! An example is available at TR #567!
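As a sanity check, the request breakdown above adds up (pure arithmetic; the comment labels are just our shorthand for the API calls involved):

```python
# 1 new TestPlan + 250 test case creations + 250 plan additions
# + 1 new Build + 1 new TestRun + 250 run additions + 250 status updates
requests = 1 + 250 + 250 + 1 + 1 + 250 + 250
print(requests)  # 1003
```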
On localhost, running the development server (./manage.py runserver) with an SQLite database I got:
$ time python perf-script
real 2m6.450s
user 0m1.069s
sys 0m0.331s
$ time python perf-script
real 2m7.472s
user 0m1.057s
sys 0m0.342s
$ time python perf-script
real 2m9.368s
user 0m1.072s
sys 0m0.351s
$ time python perf-script
real 2m9.197s
user 0m1.050s
sys 0m0.353s
This measures at 120 msec/req or 7.85 req/sec!
demo.kiwitcms.org is running on an AWS t2.micro instance (via docker-compose) with the default centos/mariadb image!
No extra settings or changes. I used the same computer over a WiFi
connection and a pretty standard home-speed Internet connection. Times are:
$ time python perf-script
real 2m18.983s
user 0m1.175s
sys 0m0.095s
$ time python perf-script
real 2m25.937s
user 0m1.156s
sys 0m0.108s
$ time python perf-script
real 2m24.120s
user 0m1.102s
sys 0m0.098s
$ time python perf-script
real 2m21.521s
user 0m1.154s
sys 0m0.091s
This measures at 140 msec/req or 7.05 req/sec!
Note: since I am using Python 3.6 I had to modify the file
/opt/rh/rh-python36/root/lib64/python3.6/ssl.py
to read:
# Used by http.client if no context is explicitly passed.
_create_default_https_context = _create_unverified_context # this disables HTTPS cert validation
The issue has been reported in RHBZ #1643454
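A less invasive alternative (plain standard library, nothing Kiwi TCMS specific) would be to leave ssl.py intact and pass an unverified SSL context only to the XML-RPC client in question — suitable only against test instances whose certificate you cannot fix:

```python
import ssl
import xmlrpc.client

# An SSL context that skips certificate validation -- this disables
# HTTPS cert checks for this one client only, not system-wide!
context = ssl._create_unverified_context()

# The proxy is created lazily; no network traffic happens here.
server = xmlrpc.client.ServerProxy(
    'https://demo.kiwitcms.org/xml-rpc/', context=context)
```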
Happy testing!
I am happy to announce that our team is steadily growing! As we work through our roadmap, status update here, and on-board new team members I start to feel the need for a bit more structure and organization behind the scenes. I also wish for consistent contributions to the project (commit early, commit often) so I can better estimate the resources that we have!
I am also actively discussing Kiwi TCMS with lots of people at various conferences and generate many ideas for the future. The latest SEETEST in Belgrade was particularly fruitful. Some of these ideas are pulling into different directions and I need help to keep them under control!
Development-wise sometimes I lose track of what's going on and who's doing what between working on Kiwi TCMS, preparing for conferences and venues to promote the project, doing code review of other team members, trying not to forget to check-in on progress (especially by interns), recruiting fresh blood and thinking about the overall future of the project. Our user base is growing and there are days where I feel like everything is happening at once or that something needs to be implemented ASAP (which is usually true anyway)!
Meet Rayna Stankova in the role of our team coach! Reny is a director for Women Who Code Sofia, senior QA engineer at VMware, mentor with CoderDojo Bulgaria and a long-time friend of mine. Although she is an experienced QA in her own right she will be contributing to the people side of Kiwi TCMS and less so technically!
Her working areas will be planning and organization:
and generally serving as another very experienced member of the team!
We did a quick brainstorming yesterday and started to produce results (#smileyface)! We do have a team docs space to share information (non-public for now, will open it gradually as we grow) and came up with the idea to use Kiwi TCMS as a check-list for our on-boarding/internship process!
I don't know how it will play out, but I do expect the team to self-improve, be inspired, and become more focused and more productive! All of this also applies to myself, even more so!
Last year we started with 2 existing team members (Tony and myself) and 3 new interns (Ivo, Kaloyan and Tseko) who built this website!
Tony is the #4 contributor to Kiwi TCMS in terms of number of commits and is on track to surpass one of the original authors (before Kiwi TCMS was forked)! He's been working mostly on internal refactoring and resolving the thousands of pylint errors that we had (down to around 500 I think). This summer Tony and I visited the OSCAL conference in Tirana and hosted an info booth for the project.
Ivo is the #5 contributor in terms of numbers of commits. He did learn very quickly and is working on getting rid of the remaining pylint errors. His ability to adapt and learn is quite impressive actually. Last month he co-hosted a git workshop at HackConf, a 1000+ people IT event in Sofia.
Kaloyan did most of the work on our website initially (IIRC). Now he is studying in the Netherlands and not active on the project. We are working to reboot his on-boarding and I'm hoping he will find the time to contribute to Kiwi TCMS regularly.
From the starting team only Tseko decided to move on to other ventures after he contributed to the website.
At Kiwi TCMS we have a set of training programs that teach all the necessary technical skills before we let anyone actively work on the project, let alone become a team member.
Our new interns are Denitsa Uzunova and Desislava Koleva. Both of them are coming from Vratsa Software Community and were mentors at the recently held CodeWeek hackathon in their home city! I wish them fast learning and good luck!
Happy testing!
In this blog post I will show more ways to customize Kiwi TCMS by adding logging capabilities to the API backend. In fact, this is a feature our team deemed not required upstream, and it was removed in PR #436.
Start by creating the following directory structure:
api_logging/
    __init__.py
    handlers.py
    models.py
This is a small Django application that will log every call to the API backend. Each file looks like this:
# models.py contains DB schema for your new table
from django.db import models
from django.conf import settings


class ApiCallLog(models.Model):
    executed_at = models.DateTimeField(auto_now_add=True)
    user = models.ForeignKey(settings.AUTH_USER_MODEL, null=True, blank=True,
                             on_delete=models.CASCADE)
    method = models.CharField(max_length=255)
    args = models.TextField(blank=True)

    def __str__(self):
        return "%s: %s" % (self.user, self.method)
Then
# handlers.py overrides the RPC handlers coming from django-modernrpc
from modernrpc import handlers

from django.conf import settings
from django.contrib.auth import get_user_model

from .models import ApiCallLog


def log_call(request, method_name, args):
    """ Log an RPC call to the database or stdout in DEBUG mode. """
    request_user = request.user
    if not request_user.is_authenticated:
        # create an anonymous user object for logging purposes
        request_user, _ = get_user_model().objects.get_or_create(
            username='Anonymous',
            is_active=False)

    if method_name is None:
        method_name = '--- method_name missing ---'

    if settings.DEBUG:
        print('API call:: user: {0}, method: {1}, args: {2}'.format(
            request_user,
            method_name,
            args))
    else:
        ApiCallLog.objects.create(
            user=request_user,
            method=method_name,
            args=str(args))


class XMLRPCHandler(handlers.XMLRPCHandler):
    def process_request(self):
        encoding = self.request.encoding or 'utf-8'
        data = self.request.body.decode(encoding)
        params, method_name = self.loads(data)

        log_call(self.request, method_name, params)
        return super().process_request()


class JSONRPCHandler(handlers.JSONRPCHandler):
    def process_single_request(self, payload):
        method_name = payload.get('method', None)
        params = payload.get('params')

        log_call(self.request, method_name, params)
        return super().process_single_request(payload)
NOTE:
You will have to execute ./manage.py makemigrations api_logging
to create the
initial migration for Django. This could be easier if you place the above directory
into existing Django application or craft the migration file by hand!
The last thing you need to do is create a local_settings.py file which will override the Kiwi TCMS defaults:
# local_settings.py
from django.conf import settings

settings.INSTALLED_APPS += [
    'api_logging',
]

MODERNRPC_HANDLERS = ['api_logging.handlers.XMLRPCHandler',
                      'api_logging.handlers.JSONRPCHandler']
Then place everything in Dockerfile
like so:
FROM kiwitcms/kiwi
COPY ./api_logging/ /venv/lib64/python3.6/site-packages/api_logging/
COPY local_settings.py /venv/lib64/python3.6/site-packages/tcms/settings/
Kiwi TCMS will import your local_settings.py
and enable the logging application.
Now build your customized Docker image and use it for deployment!
Happy testing!
This is the first publication in our customization series. It will show you how to override any template used by Kiwi TCMS. As an example we will override the email template that is used when registering a new account. By default the email text looks like this:
Welcome {{ user }},
thank you for signing up for an {{ site_domain }} account!
To activate your account, click this link:
{{ confirm_url }}
https://demo.kiwitcms.org runs a custom Docker image based on kiwitcms/kiwi. For this image the confirmation email looks like this:
Welcome {{ user }},
thank you for signing up for our Kiwi TCMS demo!
To activate your account, click this link:
{{ confirm_url }}
GDPR no longer allows us to automatically subscribe you to
our newsletter. If you wish to keep in touch and receive emails
with news and updates around Kiwi TCMS please subscribe at:
https://kiwitcms.us17.list-manage.com/subscribe/post?u=9b57a21155a3b7c655ae8f922&id=c970a37581
--
Happy testing!
The Kiwi TCMS team
http://kiwitcms.org
The file that we want to override is tcms/templates/email/confirm_registration.txt.
Create a local directory (git repository) which will hold customization configuration
and create a file named templates.d/email/confirm_registration.txt
with your text!
Next you want to make this file available inside your docker image so your Dockerfile
should look like this:
FROM kiwitcms/kiwi
COPY ./templates.d/ /venv/lib64/python3.6/site-packages/tcms/overridden_templates/
COPY local_settings.py /venv/lib64/python3.6/site-packages/tcms/settings/
where local_settings.py
contains
import os
from django.conf import settings
settings.TEMPLATES[0]['DIRS'].insert(0, os.path.join(settings.TCMS_ROOT_PATH, 'overridden_templates'))
The local_settings.py snippet instructs Django to look into overridden_templates first and use any templates it finds there, while the Dockerfile makes the files available at that specific location inside the docker image.
This approach can be used for all templates that you wish to override. Take into account that file names must match (Django searches templates by directory path). Now build your customized Docker image and use that for deployment!
Happy testing!
When you start Kiwi TCMS by running docker-compose up (see here) it will automatically create 2 volumes: kiwi_db_data and kiwi_uploads.
This blog post will outline how to backup these docker volumes.
Kiwi TCMS is a Django application and the manage.py
command provides an easy way
to dump and load the database contents. To export all contents on your docker host
execute:
docker exec -it kiwi_web /Kiwi/manage.py dumpdata --all --indent 2 > database.json
This will create a file named database.json
in the current directory, outside of the
running container!
You can restore the database contents by using the following commands:
# delete data from all tables
docker exec -it kiwi_web /bin/bash -c '/Kiwi/manage.py sqlflush | /Kiwi/manage.py dbshell'
# then reload the existing data
cat database.json | docker exec -i kiwi_web /Kiwi/manage.py loaddata --format json -
NOTE: depending on your scenario you may want to remove the existing volume
(docker-compose down && docker volume rm kiwi_db_data
) and re-create the
DB schema (/Kiwi/manage.py migrate
) before restoring the contents!
WARNING: the above steps are applicable to Kiwi TCMS 5.1 or above. On earlier
versions manage.py
will fail due to various issues.
Uploaded files can easily be backed up with:
docker exec -it kiwi_web /bin/tar -cP /Kiwi/uploads > uploads.tar
and then restored:
cat uploads.tar | docker exec -i kiwi_web /bin/tar -x
You may also try the rsync
command but be aware that it is not installed
by default!
The same approach may be used to backup /var/lib/mysql/
from the kiwi_db
container.
By default both docker volumes created for Kiwi TCMS use the local
driver
and are available under /var/lib/docker/volumes/<volume_name>
on the host
running your containers. You can try backing them up from there as well.
Another alternative is to use the
docker-lvm-plugin
and create these volumes as LVM2 block devices. Then use
lvcreate -s
command to create a snapshot volume. For more information see
chapter 2.3.5. Snapshot Volumes
from the LVM Administrator Guide for Red Hat Enterprise Linux 7.
Happy testing!
Hello everyone, in this article I will outline the progress that the Kiwi TCMS team has made towards achieving the goals on our roadmap.
Status: moderate progress
Initially CodeClimate reported a "D" rating with a 1 year estimated effort. The project is still at a "D" rating, but now with a 7 month estimated effort to bring it back in shape. Code smells have dropped from 600+ to 418, and duplications have been reduced from 600+ to 359! At the same time the technical debt ratio has decreased from 32.5% to 21.6% and a little over 10,000 lines have been removed from the source code. Check out the stats for more info!
Status: good progress
Both pylint and pylint-django have been integrated into our CI workflow. There are even a few custom-built plugins that we use. The number of issues reported is down to around 900 from 4000+ initially. The cleanup has been led by Anton Sankov with help from Ivaylo Ivanov and myself.
Status: no progress
Several views were probably modified to return pure JSON in the meantime but we've not done any targeted work to resolve this issue.
Status: no progress
Same as above, not much has been done in this area.
Status: complete
After Kiwi TCMS v4.0 the server side API has been reorganized and updated to follow the model/method names used internally.
After the recent version 5.0 the client side API library has been stripped to its most basic form so that you can work directly with the responses from the server.
There is no more duplication and ambiguity in names because there isn't a lot of code left!
Status: moderate progress, dropped
All RPC methods have been documented! The rest of the internals will be documented as we go along.
Status: moderate progress
Several JavaScript libraries have been removed but we still carry around jQuery and Handlebars.js. No work has been done to convert Kiwi TCMS to use the jQuery version provided with Django.
Status: minimal progress
There are still over 100 HTML templates in Kiwi TCMS. Some of the HTML templates have been merged together, some email templates have been refactored and marked as translatable but the majority of them have not been updated for a long time.
Status: no progress
Status: small progress
A number of JavaScript functions have been refactored and removed during the past few releases but there are still thousands of lines of code left to deal with.
Status: moderate progress
We are seeing a steady stream of new users registered on https://demo.kiwitcms.org and there are several active contributors (issues, translations).
Kiwi TCMS was represented at OSCAL Tirana, DjangoCon Heidelberg and PyCon Prague! We're planning to attend HackConf and OpenFest in Sofia by the end of the year.
Happy testing!