Install QQ/TIM in Linux with Wine

This tutorial introduces how to install QQ/TIM on Linux with Wine, and has been tested on Arch Linux with Wine 2.4. Prerequisites Before you start, you need to get the latest Wine; I am not sure whether QQ/TIM can run on older versions of Wine. On Arch Linux, you can easily get the latest Wine with the following command: On Debian, however, installing Wine takes a few more steps; you can see this tutorial. Next, you need to install a helper for Wine, Winetricks. Winetricks is a script that downloads and installs the various redistributable runtime libraries needed to run some programs in Wine. To install Winetricks, you can use the following command: After that, we need to manually fix some problems caused by Winetricks. According…
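The excerpt references the install commands without showing them. On Arch Linux, a reasonable sketch looks like the following (package names are assumptions based on the official repositories; Wine lives in the multilib repository, which must be enabled):

```shell
# Arch Linux: install Wine and Winetricks from the official repositories
# (requires the [multilib] repository to be enabled in /etc/pacman.conf)
sudo pacman -S wine winetricks

# Verify what was installed
wine --version
winetricks --version
```

On Debian, the equivalent involves enabling 32-bit support and adding the appropriate repository first, which is why the post points to a separate tutorial.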

Continue Reading

Install SciPy on Windows

In this tutorial, you will set up a numerical Python development environment on Windows 10. As you may have realized, Python is rather simple to set up on a Linux/macOS box, but as with many open-source projects, getting up and running on Windows is never trivial. Good solutions for Windows are Enthought Canopy, Anaconda (both of which provide binary installers for Windows, OS X, and Linux), and Python(x,y). These distributions include Python, NumPy, and many additional packages. However, you can still install these packages manually. Prerequisites Before you start, I assume that you have installed Python with pip on Windows correctly. You need to download the following packages: Microsoft Visual C++ Compiler for Python 2.7 Source…
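Once the packages are installed, a quick smoke test confirms the numerical stack works. This minimal sketch only assumes NumPy (the foundation SciPy builds on) imports correctly:

```python
# Sanity check for a numerical Python setup:
# compute a 2x2 determinant with NumPy's linear algebra routines.
import numpy as np

a = np.array([[3.0, 2.0],
              [1.0, 4.0]])
det = np.linalg.det(a)
print(round(det, 6))  # 3*4 - 2*1 = 10 -> prints 10.0
```

If this runs without an `ImportError`, the compiled extension modules were installed correctly, which is usually the hard part on Windows.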

Continue Reading

Configuring Apereo CAS

This tutorial is designed to help a new CAS user set up an Apereo CAS server and add a CAS client to their applications. The code for this tutorial is open-sourced on GitHub. What's CAS? Enterprise Single Sign-On: CAS provides a friendly open-source community that actively supports and contributes to the project. While the project is rooted in higher-ed open source, it has grown to an international audience spanning Fortune 500 companies and small special-purpose installations. CAS provides an enterprise single sign-on service for the Web: An open and well-documented protocol An open-source Java server component Pluggable authentication support (LDAP, database, X.509, 2-factor) Support for multiple protocols (CAS, SAML, OAuth, OpenID) A library of clients for Java, .NET, PHP, Perl, Apache, uPortal, and others Integrates with…

Continue Reading

Create a Desktop App with Angular 2 and Electron

Last week, I took part in the Google Developer Day held in Beijing, where the Angular team introduced their new Angular 2. Angular is a development platform for building mobile and desktop web applications. This tutorial will show how to configure and use Angular 2 web components with the Electron framework to create native cross-platform applications with web technologies. As recommended by the Angular team, TypeScript will be used throughout this tutorial. TypeScript is a typed superset of JavaScript that compiles to plain JavaScript. Any browser. Any host. Any OS. Open source. You will find a link to the finished working example on GitHub at the end of the article. Prerequisites Before you start, please make sure that node, npm, typescript, and typings are installed. Setup Electron with…
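To give a feel for the Electron side of the setup, here is a minimal sketch of an Electron main-process entry point in TypeScript. The file name and window options are illustrative, not taken from the post; it assumes `electron` is installed as a dependency:

```typescript
// main.ts — minimal Electron entry point (illustrative sketch)
import { app, BrowserWindow } from 'electron';

let win: BrowserWindow | null = null;

app.on('ready', () => {
  // Create the native window that will host the Angular 2 app
  win = new BrowserWindow({ width: 800, height: 600 });

  // Load the compiled Angular app's index.html from disk
  win.loadURL(`file://${__dirname}/index.html`);

  // Release the reference when the window is closed
  win.on('closed', () => {
    win = null;
  });
});
```

The TypeScript here is compiled to plain JavaScript and pointed to by the `main` field of `package.json`, which is the pattern the tutorial goes on to configure.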

Continue Reading

Gradient Boosting Decision Tree

In the previous article, we talked about AdaBoost, which combines the output of weak learners into a weighted sum that represents the final output of the boosted classifier. If you know little about AdaBoost or additive models, we highly recommend reading that article first. Gradient boosting is a machine learning technique for regression and classification problems that produces a prediction model in the form of an ensemble of weak prediction models, typically decision trees. It builds the model in a stage-wise fashion like other boosting methods do, and it generalizes them by allowing optimization of an arbitrary differentiable loss function. Boosting Tree A boosting tree is based on the additive model, which can be represented as follows: \(f_M(x) = \sum_{m=1}^{M} T(x; \theta_m)\) where \(T(x; \theta_m)\)
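The stage-wise fashion mentioned above is the standard forward stagewise procedure: at step \(m\), the trees already fitted are held fixed and only the new tree is learned,

\(f_m(x) = f_{m-1}(x) + T(x; \theta_m)\)

with the new tree's parameters chosen to minimize the empirical loss

\(\hat{\theta}_m = \arg\min_{\theta_m} \sum_{i=1}^{N} L\big(y_i,\; f_{m-1}(x_i) + T(x_i; \theta_m)\big)\)

Gradient boosting generalizes this by fitting each \(T(x; \theta_m)\) to the negative gradient of \(L\) evaluated at \(f_{m-1}\), which is what allows an arbitrary differentiable loss function.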

Continue Reading

AdaBoost

AdaBoost, short for "Adaptive Boosting", is a machine learning meta-algorithm formulated by Yoav Freund and Robert Schapire, who won the Gödel Prize in 2003 for their work. The outputs of other learning algorithms (weak learners) are combined into a weighted sum that represents the final output of the boosted classifier. AdaBoost is adaptive in the sense that subsequent weak learners are tweaked in favor of those instances misclassified by previous classifiers. AdaBoost is sensitive to noisy data and outliers, but in some problems it can be quite robust to overfitting. Bagging vs. Boosting An overly complex model (e.g., an unpruned decision tree) has high variance but low bias, whereas an overly simple model (a weak learner such as a decision stump) has high bias but low variance. To minimize…
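The adaptive reweighting described above can be sketched in a few lines. This is a didactic implementation with brute-force decision stumps, not the post's code; variable names and the stump search are my own choices:

```python
# AdaBoost with decision stumps, for labels y in {-1, +1}.
# Didactic sketch: stumps are found by exhaustive search.
import numpy as np

def adaboost(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)            # start with uniform sample weights
    learners = []                      # (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # search every (feature, threshold, polarity) stump
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                    err = w[pred != y].sum()   # weighted error
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        err = max(best_err, 1e-12)             # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this learner
        j, thr, pol = best
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified points
        w /= w.sum()
        learners.append((j, thr, pol, alpha))
    return learners

def predict(learners, X):
    # weighted sum of weak learners, thresholded at zero
    agg = np.zeros(len(X))
    for j, thr, pol, alpha in learners:
        agg += alpha * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
    return np.sign(agg)
```

The key lines are the exponential reweighting (`w *= np.exp(-alpha * y * pred)`), which makes the next round focus on the mistakes, and the weighted sum in `predict`, which is exactly the "weighted sum of weak learners" described above.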

Continue Reading
Contact Us
  • Room 311, Zonghe Building, Harbin Institute of Technology
  • cshzxie [at] gmail.com