Blog

  • docker-pyload

    docker-pyload

    Install Pyload-ng pypi version into a Linux container

Tags

Several tags are available:

    Description

    pyLoad is a free and open source downloader for 1-click-hosting sites like rapidshare.com or uploaded.to. It supports link decryption as well as all important container formats.

    https://github.com/pyload/pyload

    Usage

    docker create --name=pyload  \
      -v <path to data>:/config \
      -v <path to downloads>:/downloads \
      -v <path to temporary downloads>:/temporary-downloads \
      -e UID=<UID default:12345> \
      -e GID=<GID default:12345> \
      -e AUTOUPGRADE=<0|1 default:0> \
      -e TZ=<timezone default:Europe/Brussels> \
      -e DOCKMAIL=<mail address> \
      -e DOCKRELAY=<smtp relay> \
      -e DOCKMAILDOMAIN=<originating mail domain> \
      -p 8000:8000  \
      -p 7227:7227  \
      -p 9666:9666 \
    digrouz/pyload
    

    Environment Variables

    When you start the pyload image, you can adjust the configuration of the pyload instance by passing one or more environment variables on the docker run command line.

    UID

This variable is not mandatory and specifies the user ID that will be used to run the application. It defaults to 12345.

    GID

This variable is not mandatory and specifies the group ID that will be used to run the application. It defaults to 12345.

    AUTOUPGRADE

This variable is not mandatory and specifies whether the container should run a software update at startup. Valid values are 0 and 1. It defaults to 0.

    TZ

This variable is not mandatory and specifies the timezone to be configured within the container. It defaults to Europe/Brussels.

    DOCKRELAY

This variable is not mandatory and specifies the SMTP relay that will be used to send email. Leave it unset if mail notifications are not required.

    DOCKMAIL

This variable is not mandatory and specifies the email address that will be used to send mail. Leave it unset if mail notifications are not required.

    DOCKMAILDOMAIN

This variable is not mandatory and specifies the domain the mail appears to come from, used for user authentication. Leave it unset if mail notifications are not required.
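
As a concrete example, the command below creates and starts a container with auto-upgrade enabled and mail notifications left unconfigured (the host paths and UID/GID values are placeholders; adjust them to your setup):

docker create --name=pyload \
  -v /srv/pyload/config:/config \
  -v /srv/pyload/downloads:/downloads \
  -v /srv/pyload/tmp:/temporary-downloads \
  -e UID=1000 \
  -e GID=1000 \
  -e AUTOUPGRADE=1 \
  -e TZ=Europe/Brussels \
  -p 8000:8000 \
  -p 7227:7227 \
  -p 9666:9666 \
digrouz/pyload
docker start pyload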

    Notes

• This container is built using s6-overlay
• The docker entrypoint can upgrade the operating system at each startup. To enable this feature, just add -e AUTOUPGRADE=1 at container creation.
• Port 8000 is used for the web UI
• Port 7227 is used for the API
• Port 9666 is used for the Click'n'Load plugin

    Issues

If you encounter an issue, please open a ticket on GitHub.

  • dataJobs_project

    dataJobs_project

The goal is to create a model to predict the most in-demand skills in the job market. This can help professionals, students, and the industry at large make informed decisions about which skills to develop and improve.

The main objective of the project is to analyze job postings in the data field in order to identify patterns and trends that reveal the most in-demand skills in the current job market. This research can help people who are interested in, or want to enter, this field by informing their career decisions. All of this is done using the Kaggle dataset Data Jobs Listings – Glassdoor.

Repository files

• EDA_project.ipynb: Contains the full exploratory analysis of the tables used, namely “glassdoor.csv”, “glassdoor_benefits_highlights.csv”, and “glassdoor_salary_salaries.csv” (download these CSVs to run the code).

• ETL_project.pdf: The document describing, in detail, the phases that were carried out.

• dataJobs_script.py: The main script, which connects to PostgreSQL, loads the data into the database, replaces null values and empty fields in some columns, and normalizes the job titles.

• df_tocsv_and_transformations.py: Uses pandas to ingest the CSV correctly; this is where most of the data transformation for normalizing jobTitle takes place.

• dimensions_script.py: Inserts data into the dimensions that will be used later in the project.

• project_dashboard.pdf: The dashboard I built in Power BI, with three charts so far.

How to run the scripts

1. Clone the repository with https://github.com/VinkeArtunduaga/dataJobs_project.git
2. Install Python 3.11
3. Install the PostgreSQL database
4. Install the required libraries with pip install psycopg2 and pip install pandas; json and csv are also used, but they ship with the standard library.
5. Create a username and password for PostgreSQL
6. Create a database in pgAdmin 4 named ETL (that is what I named mine, but it can be changed)
7. Update the database connection settings to match your username, password, and database
8. First, run df_tocsv_and_transformations.py from the terminal with python df_tocsv_and_transformations.py
9. Then run python dataJobs_script.py to create the main table with its normalizations and null cleanup.
10. Finally, run python dimensions_script.py to create the dimensions (the consolidated commands are sketched after this list).
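
Putting the steps together, a typical end-to-end run might look like this (assuming PostgreSQL is running and the connection settings in the scripts match your setup):

git clone https://github.com/VinkeArtunduaga/dataJobs_project.git
cd dataJobs_project
pip install psycopg2 pandas
python df_tocsv_and_transformations.py
python dataJobs_script.py
python dimensions_script.py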

If you want to run the EDA process:

1. Install Jupyter; JupyterLab makes this easier
2. If pandas is not already installed, run pip install pandas in the terminal (json ships with Python's standard library, so it does not need to be installed)
3. Update the paths pointing to the glassdoor.csv, glassdoor_benefits_highlights.csv, and glassdoor_salary_salaries.csv files.
4. Run each cell individually, or all of them in sequence, to see the analysis.

For the second part of the project, I created a folder named API; everything that was done for it is there.


  • moodle-auth_cfour

    a Moodle link based authentication plugin

    • Author: Luuk Verhoeven, MFreak.nl
    • Min. required: Moodle 3.6.x
    • Supports PHP: 7.0 | 7.1 | 7.2

    List of features

    • Shared key encryption.
    • Authenticate a user with a direct link.
    • Redirect after login to a redirect url.
    • Allow locking userfields.

    Installation

1. Copy this plugin to the auth/cfour folder on the server (see the example after this list)
2. Login as administrator
3. Go to Site Administration > Notifications
4. Install the plugin
5. Add the correct AUTHENTICATION key to the settings at /admin/settings.php?section=authsettingcfour.
6. Enable the authentication module at /admin/category.php?category=authsettings.
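
For step 1, a typical copy on the server might look like this (the Moodle root path is an assumption; adjust it to your installation):

cp -R moodle-auth_cfour /var/www/moodle/auth/cfour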

    Usage

1. A user should already be created in Moodle. You can add a new user with the API; for security reasons, make sure the auth property has the value cfour.
2. Build a link on an external system, see the example below.
<?php
// External system, sample code.
define('SHARED_AUTHENTICATION_KEY', 'LONG_KEY_HERE');

/**
 * Get an authentication code.
 *
 * @param int $userid
 * @param string $username
 * @return string
 */
function get_code(int $userid, string $username) {
    return hash('sha256', SHARED_AUTHENTICATION_KEY . '|' . $userid . '|' . $username);
}

// Building the link.
$domain = 'https://moodle.test.domain.com/';
$plugin = 'auth/cfour/login.php?';

// The user that exists in Moodle and has the `cfour` auth property in their account.
$moodleusername = 'student1';
$moodleuserid = 2;

$params = [
    'sso_username' => $moodleusername,
    'sso_code' => get_code($moodleuserid, $moodleusername),
    'wantsurl' => '/course/view.php?id=2'
];

// Make sure all params get urlencoded!
$url = $domain . $plugin . http_build_query($params);

// e.g. https://moodle.test.domain.com/auth/cfour/login.php?sso_username=student1&sso_code=<sha256 hash>&wantsurl=%2Fcourse%2Fview.php%3Fid%3D2
header('Location: ' . $url);
die;
3. Use the link wherever you want. Keep in mind that no expiry date is implemented.

    Security

    If you discover any security related issues, please email luuk@MFreak.nl instead of using the issue tracker.

    License

    The GNU GENERAL PUBLIC LICENSE. Please see License File for more information.

    Changelog

    See version control for the complete history. Major changes in this version will be listed below.

  • url_shortener

    URL Shortener URL Instructions

If you want to build the next bit.ly, goo.gl, or ow.ly yourself, here is a project you might consider starting with.

    URL Shortener is a URL shortening service where you enter a URL such as https://codesubmit.io/library/react and it returns a short URL such as http://short.est/GeAi9K.

    Simply ensure that a URL can be encoded into a short URL and that the short URL can be decoded back
    into the original URL.

    Installation

Follow the steps below to install the project:

• Clone the repository
git clone https://github.com/lytrungtin/url_shortener.git
    • Go to project folder
    cd url_shortener
    • Copy env values file from example

    cp .env.example .env
    cp config/database.yml.sample config/database.yml
    • Bundle install
    bundle install
    • Prepare database
    rails db:reset
    • Run tests
    rails test
    • Starting backend server
    rails server

Open a new terminal tab to run the frontend.

    • Install node modules required for web frontend
    npm install
    • Starting web frontend application
    npm run start

    URL Shortener

    Description:

    Nobody likes an impossibly long URL.

    They’re hard to decipher. But sometimes, between a deep directory structure on a site, plus a large number of parameters tacked on to the end, URLs just begin to get unwieldy. And back in the days before Twitter added their own link shortener to their service, a long URL meant taking precious characters away from your tweets.

    Today, people use link shorteners for a slew of reasons. They can make it much easier to type, or remember, an otherwise lengthy bare URL. They can bring a consistent branding to a social media account. They make it easier to perform analytics across a group of URLs. They make it possible to provide a consistent entryway to a URL that may change frequently on the other side.

    There are some challenges to URL shorteners, to be sure. They make it challenging to figure out where a link is actually taking you before you click, and they’re vulnerable to linkrot, should the service providing the short URL for you ever disappear. But despite these challenges, URL shorteners aren’t going anywhere anytime soon.

    But with so many free link shortening services out there, why roll your own? In short: control. While some services will let you pick your own domain to use, sometimes, that’s about the level of customization you’re going to get. With a self-hosted service, you decide how long your service operates for. You decide what format your URLs take. You decide who has access to your analytics. It’s yours to own and operate as you please.

    (Jason Baker, Want to build your own URL shortener?)

    • Two endpoints are provided:
      • /encode: Encodes a URL to a shortened URL
      • /decode: Decodes a shortened URL to its original URL
• Here is a Postman collection you can download:
• Links to demo instances
• The project uses the latest optimized versions of Rails and React

    Screenshot from web frontend:

    API endpoints Usage

    Valid request to /encode endpoint:

    POST /api/v1/url/encode
    
    {
      "url": {
        "original_url": "https://codesubmit.io/library/react"
      }
    }

    Body success response from /encode endpoint:

    200 OK
    
    {
      "status": true,
      "data": [
        {
          "shortened_url": "http://localhost:3000/rxjODc"
        }
      ]
    }

    Invalid request to /encode endpoint:

    POST /api/v1/url/encode
    
    {
      "url": {
        "original_url": "https://google.com/something_wrong"
      }
    }

    Body failure response from /encode endpoint:

    422 Unprocessable Entity
    
    {
      "status": false,
      "errors": [
        "Original url is invalid"
      ]
    }

    Valid request to /decode endpoint:

    POST /api/v1/url/decode
    
    {
        "url": {
            "shortened_url": "http://localhost:3000/rxjODc"
        }
    }

    Body success response from /decode endpoint

    200 OK
    
    {
      "status": true,
      "data": [
          {
              "original_url": "https://codesubmit.io/library/react"
          }
      ]
    }

    Invalid request to /decode endpoint

    POST /api/v1/url/decode
    
    {
        "url": {
            "shortened_url": "https://codesubmit.io/this_is_not_shortened_url"
        }
    }

    Body failure response from /decode endpoint

    422 Unprocessable Entity
    
    {
        "status": false,
        "errors": [
            "Shorten URL is not valid"
        ]
    }
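
For quick manual testing, the endpoints can be exercised with curl against a local server (this assumes the backend from the installation steps is listening on localhost:3000):

curl -X POST http://localhost:3000/api/v1/url/encode \
  -H 'Content-Type: application/json' \
  -d '{"url": {"original_url": "https://codesubmit.io/library/react"}}'

curl -X POST http://localhost:3000/api/v1/url/decode \
  -H 'Content-Type: application/json' \
  -d '{"url": {"shortened_url": "http://localhost:3000/rxjODc"}}'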

    Known issues:

    • Potential issues:

• Users can exploit Xkcd 1171 to insert malicious URLs into our endpoints.
  The Ruby standard library already comes with a URI parser, accessible via URI.parse.
• Users can also pass links that are vulnerable to link rot.
  Net::HTTP, a very powerful HTTP client library for Ruby, is designed to work with URI.
  It is used to check whether the request succeeds or is redirected to another URL.
• Rails provides an API to create custom validations, which is used here to build a URL validator.
• Also, the original URL should not come from the current host, which is reserved for shortened URLs.
• The validator code is placed in app/validators/url_validator.rb
    • Scalability issues:

• The project makes heavy use of Net::HTTP to send requests to other sites.
• In addition to optimizing database queries, take full advantage of Rails' built-in action, page, and fragment caching.
• Use memcached to cache results that you'd otherwise pull from your database or fetch from remote URLs.
• In addition to storing slugs alongside original URLs in the database, we can export them to HTML, JSON, or YAML.
• Rely less on the RDBMS and move storage fully to Redis, removing the database connection for all requests.
• Reconfigure the server so that redirects are served directly from generated files without querying any data from the database.
    • Maintainability:

• RSpec should be implemented for unit tests.
• GitHub Actions, a Heroku pipeline, PR preview and staging environments already have workflows installed, and activated security alert bots can proactively send notifications when dependencies need to be updated or replaced.


  • apple_complete

    bash completion for Apple tools

These are a number of scripts designed to add programmable completion to
bash for some of the command line tools. They complete the options for
each tool; for most, the completion cuts in after - or --.

    If you have installed bash completion using brew then drop the files
    into /usr/local/etc/bash_completion.d/ (or link them) and open a new
    Terminal window.
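
For example, linking the scripts into place might look like this (the checkout path is an assumption):

ln -s ~/apple_complete/* /usr/local/etc/bash_completion.d/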

    Feedback, bug reports and comments would be greatly appreciated. If you
    want to nominate a CLI tool for me to add to this set, drop me a note or
    open an issue.

    For zsh

You can use bash completions with zsh: close to the top of .zshrc, put:

    # bash compatible completion
    autoload bashcompinit
    bashcompinit
    

and add the line source <path to directory of completions>/* right at the end, e.g.
source ~/bin/bash_completion/*.

    Notes

    bless

    For bless it first completes on the action option and then
    completes on the options for that action.

    diskutil

    For diskutil it first completes on the verb and then the options for
    the verb. For options prefaced with - it will only complete after you
    enter the -. For verbs such as eraseDisk where you specify a format
    argument it will also complete the formats. No work has been done to
    create more specific completion than at the verb level or to make sure
    you have options and arguments in the right order. Quite a lot of the
    verbs take no options.

    hdiutil

    For hdiutil it first completes on the verbs and then completes on the
    options for each verb once you enter the -. No work has been done to
    create more specific completion than at the verb level or to make sure
    you have options and arguments in the right order.

    quickpkg

It will complete all the options. If you enter the option --ownership
it will then complete on the three possibilities. If you enter the
option --keychain then it will complete on all the keychains the
system knows about.

    networksetup

    If you enter a - it will complete on the options (there are a heck of
    a lot). Most options require a service name, if you enter " then it
    will complete on the current service names.

    If you want to complete on a hardware port then enter ' (that’s a single
    quote) then it will complete on the hardware ports. I’m still figuring out
    how to handle the few options that want either a hardware port or
    device name.

The Airport and wireless options want a device name (‘en0’, ‘bridge0’ etc)
so they will complete the device names. This is handier on Macs with a
bunch of pseudo devices.

    More expansions to this may come. Feedback appreciated.


  • Android-Template-Activity-with-MVP-and-Repository-Pattern

    Android Studio Templates

A set of Android Studio templates for Android development.

    To use these templates you will need Android Studio.

    Copy the appropriate folders into <androidStudio-folder>/plugins/android/lib/templates/ and they will appear in the project explorer context menu.

    The templates folder contains these templates:

• [MVPActivity]: creates an Android activity (MVPActivity) with a Repository layer
• [MVPLoginActivity]: creates an Android activity (MVPActivity) for a login screen

MVP with Repository Pattern

Creates a new blank activity following MVP with the Repository Pattern.

Copy the MVP with Repository Pattern folder into your Android Studio installation, in this folder: <androidStudio-folder>/plugins/android/lib/templates/activities
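
On macOS, for example, the copy might look like this (the install path is an assumption, and the folder name should match the one shipped in this repo):

cp -R "MVP with Repository Pattern" \
  "/Applications/Android Studio.app/Contents/plugins/android/lib/templates/activities/"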

Restart Android Studio, and you will find it in: New -> Activity -> MVP with Repository Pattern

    Acknowledgements

    • Thanks to Android Studio’s original templates

    Credits

    Author: Attiq Ur Rehman (attiq.ur.rehman1991@gmail.com)

Follow me on LinkedIn, Google+, or Twitter.

    License

    MIT License

    Copyright (c) 2017 Attiq ur Rehman

    Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

    The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

    THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


  • Netflix

    This is a Next.js project bootstrapped with create-next-app.

    Getting Started

    First, run the development server:

    npm run dev
    # or
    yarn dev

    Open http://localhost:3000 with your browser to see the result.

    You can start editing the page by modifying pages/index.js. The page auto-updates as you edit the file.

    API routes can be accessed on http://localhost:3000/api/hello. This endpoint can be edited in pages/api/hello.js.

    The pages/api directory is mapped to /api/*. Files in this directory are treated as API routes instead of React pages.

    Setup Local Environment

You need to set up a few API keys for this project to work correctly; otherwise you won’t see any videos.

For that, you need to create a .env.local file in your project, as shown in the docs, that will look like this:

    NEXT_PUBLIC_HASURA_ADMIN_URL=<REPLACE THIS>
    JWT_SECRET=<REPLACE THIS>
    NEXT_PUBLIC_HASURA_ADMIN_SECRET=<REPLACE THIS>
    MAGIC_SERVER_KEY=<REPLACE THIS>
    NEXT_PUBLIC_MAGIC_PUBLISHABLE_API_KEY=<REPLACE THIS>
    YOUTUBE_API_KEY=<REPLACE THIS>
    

You can retrieve the above environment values by referring to their docs linked above; once retrieved, paste them in accordingly.

*** Important: Videos from YouTube ***

During local development, we recommend adding the environment variable DEVELOPMENT=true, as that won’t fetch videos from the YouTube API and will instead read them from data/videos.json. The YouTube API has a quota, and this way you can continue doing local development without worrying about running out of API calls.
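
For example, the flag can be appended to the local env file with a one-liner (a sketch; the .env.local file must already exist from the step above):

echo "DEVELOPMENT=true" >> .env.local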

    Learn More

    To learn more about Next.js, take a look at the following resources:

    You can check out the Next.js GitHub repository – your feedback and contributions are welcome!

    Deploy on Vercel

    The easiest way to deploy your Next.js app is to use the Vercel Platform from the creators of Next.js.

    Check out our Next.js deployment documentation for more details.


  • Draw2Text

    👁 Draw2Text: AI-powered digit recognition app

Draw2Text is an innovative web application that recognizes numbers and letters drawn by users on the canvas. It uses advanced machine learning algorithms, including computer vision and deep learning, to segment the drawn image, predict the digit/letter label, and display the result to the user. The app also enables users to provide feedback, which can be used to fine-tune the model for better accuracy in future predictions.

    How it works

1. The user draws a digit or letter on the provided canvas
2. The app uses OpenCV, an open-source computer vision library, to segment the canvas into digit images
3. The segmented images are then passed to a neural network created with Keras, which uses deep learning techniques to predict the digit label.
4. The predicted label is then displayed to the user.
5. The user can provide feedback to improve the model’s accuracy in future predictions.
    graph TD
        A["User Drawn Digit"] --> B["OpenCV Segmentation"]
        B --> C["Keras Model"]
        C --> D["Digit Label Prediction"]
        D --> E["Display Predictions"]
        E --> F["User Feedback for Future Tuning"]
        F --> C
    
    Getting Started

    To run the app locally, please follow these steps:

1. Clone the repository
git clone https://github.com/olucaslopes/Draw2Text.git

2. Install the required packages
pip install -r requirements.txt

3. Run the app
streamlit run app.py


    Technologies used

• Cloudinary for cloud storage of the drawn digits and users’ true-label feedback for future tuning
• Streamlit for the front end
• TensorFlow Keras for model development
• Image hashing to avoid duplicated images
• OpenCV for image segmentation through contours
• Pandas and NumPy for data manipulation

    Model

The app uses a TensorFlow Keras Multi-Layer Perceptron (MLP) model to predict the drawn digits.

    Contact

    You can find out more about me on my Linkedin
