Blog

  • tokyo

    tokyo

    greed2411

When you hit rock-bottom, you still have a way to go until the abyss. - Tokyo, Netflix’s “Money Heist” (La Casa De Papel)



    image belongs to teepublic

When one is limited by the technology of the time, one resorts to Java APIs using Clojure.

This is my first attempt at Clojure: a REST API which, when a file is uploaded, identifies its mime-type, its extension and any text present inside the file, and returns the information as JSON. This works for several types of files, including ones that require OCR, thanks to Tesseract. Complete list of supported file formats by Tika.

Uses ring for the Clojure HTTP server abstraction, jetty as the actual HTTP server, pantomime as a Clojure abstraction over Apache Tika, and optionally traefik acting as a reverse proxy.

    Installation

    Two options:

1. Download openjdk-11 and install lein, then run lein uberjar
    2. Use the Dockerfile (Recommended)

    Building

1. You can obtain the .jar file from releases (if it’s available).
2. Otherwise, build the Docker image using the Dockerfile.
    docker build ./ -t tokyo
    docker run tokyo:latest
    

Note: the server defaults to port 80, because that is the port exposed in the Docker image. You can change the port number by setting the environment variable TOKYO_PORT inside the Dockerfile, or in your shell, to whichever port number you’d like when running the .jar file.

I’ve also added a docker-compose.yml which uses traefik as a reverse proxy; use docker-compose up.

    Usage

1. The /file route: make a POST request by uploading a file.

      • the command line approach using curl
      curl -XPOST  "http://localhost:80/file" -F file=@/path/to/file/sample.doc
      
      {"mime-type":"application/msword","ext":".bin","text":"Lorem ipsum \nLorem ipsum dolor sit amet, consectetur adipiscing elit. Nunc ac faucibus odio."}
      >>> import requests
      >>> import json
      
      >>> url = "http://localhost:80/file"
      >>> files = {"file": open("/path/to/file/sample.doc", "rb")}
      >>> response = requests.post(url, files=files)
      >>> json.loads(response.content)
      
      {'mime-type': 'application/msword', 'ext': '.bin', 'text': 'Lorem ipsum \nLorem ipsum dolor sit amet, consectetur adipiscing elit. Nunc ac faucibus odio.'}

The general API response (JSON schema) is of the form:

      :mime-type (string) - the mime-type of the file. eg: application/msword, text/plain etc.
      :ext       (string) - the extension of the file. eg: .txt, .jpg etc.
      :text      (string) - the text content of the file.
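
Client code can sanity-check this shape before relying on it. A minimal sketch in Python (the validate_tika_response helper is illustrative, not part of tokyo):

```python
def validate_tika_response(payload):
    """Check that a tokyo /file response has the expected JSON shape."""
    expected = {"mime-type", "ext", "text"}
    if not isinstance(payload, dict):
        raise ValueError("response must be a JSON object")
    missing = expected - payload.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    for key in expected:
        if not isinstance(payload[key], str):
            raise ValueError(f"field {key!r} must be a string")
    return payload

# A payload like the one returned for sample.doc above:
sample = {"mime-type": "application/msword", "ext": ".bin", "text": "Lorem ipsum"}
validate_tika_response(sample)  # passes; raises ValueError on a malformed payload
```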
      

Note: The files being uploaded are stored as temp files in /tmp and removed an hour later (assuming the JVM is still running for that hour or so).

2. The / route: a GET request returns Hello World as plain text, to act as a ping.

If going down the path of using docker-compose, the request changes to

    curl -XPOST  -H Host:tokyo.localhost http://localhost/file -F file=@/path/to/file/sample.doc
    
    {"mime-type":"application/msword","ext":".bin","text":"Lorem ipsum \nLorem ipsum dolor sit amet, consectetur adipiscing elit. Nunc ac faucibus odio."}

    and

    >>> response = requests.post(url, files=files, headers={"Host": "tokyo.localhost"})

where tokyo.localhost is the host rule defined in docker-compose.yml

    Why?

I had to do this because neither Python’s filetype (doesn’t identify .doc, .docx, or plain text) nor textract (a hacky way of extracting text, where one needs to know the extension before extracting) is as good as Tika. The Go package filetype didn’t support a way to extract text. So I resorted to spiraling down the path of using Java’s Apache Tika through the Clojure pantomime library.

    License

    Copyright © 2020 greed2411/tokyo

    This program and the accompanying materials are made available under the terms of the Eclipse Public License 2.0 which is available at http://www.eclipse.org/legal/epl-2.0.

    This Source Code may also be made available under the following Secondary Licenses when the conditions for such availability set forth in the Eclipse Public License, v. 2.0 are satisfied: GNU General Public License as published by the Free Software Foundation, either version 2 of the License, or (at your option) any later version, with the GNU Classpath Exception which is available at https://www.gnu.org/software/classpath/license.html.

    Visit original content creator repository https://github.com/greed2411/tokyo
  • XAdmin

Laravel5-Backend – based on Laravel 5.3

    Build Status Software License

The main goal of this project is to build a set of commonly used base services, so that you can focus on other business development. The theme has been upgraded from the original ACE to AdminLTE. Some screenshots

    Requirements

    • A web server: Nginx
    • PHP 5.6.4+ with the following extensions: mbstring, pdo_mysql
    • MySQL
    • Composer
    • NPM
• CNPM (a China-local registry mirror, for faster installs) How to install cnpm
    • Bower
    • Gulp

    Installation

    git clone https://github.com/qloog/laravel5-backend.git
    cd laravel5-backend
    
// Install back-end dependencies
composer config -g repo.packagist composer https://packagist.phpcomposer.com    // use the Composer China mirror
composer install -vvv               // download the dependencies from composer.json into vendor/


// Install front-end dependencies
cnpm install                        // install bower, gulp, laravel-elixir
bower install -V                    // install front-end components
npm run build                       // copy js/css/img to public/
php artisan vendor:publish --provider='Ender\UEditor\UEditorServiceProvider' // copy ueditor to public

// Create tables and import test data
vim .env                            // change to your own database settings
php artisan migrate                 // create the table structure
php artisan db:seed                 // seed test data

// Start the server
php artisan serve --port 8001       // run the server
open http://localhost:8001/admin/login  // username: admin@test.com, password: 12345678

    Features

• User management (Done)
• Role management (Done)
• Permission management (Role-Based Access Control) (Done)
• Menu management
• Operation management
• Log management

    Coding Style

• PHP: follows the PSR standards

      • PSR1 Basic Coding Standard
      • PSR2 Coding Style Guide
      • PSR3 Logger Interface
      • PSR4 Autoloading Standard
      • PSR6 Caching Interface
      • PSR7 HTTP Message Interface

    Code check and fix

Use PHPCS to check the code style

// single file: quickly check how well one file conforms to PSR
./vendor/bin/phpcs -p --standard=PSR2 --ignore=vendor  /path/to/file
// directory
./vendor/bin/phpcs -p --standard=PSR2 --ignore=vendor  /path/to/dir

    PHP-CS-FIXER 修复代码

php-cs-fixer is a code formatter that follows the PSR standards. It can be installed via Composer:

// install
composer require friendsofphp/php-cs-fixer
// fix code
./vendor/bin/php-cs-fixer fix app/Http/Controllers/Backend/UserController.php --level=psr2

Usage documentation:

    Code Document

Write comments following the phpdoc standard to generate code documentation automatically: phpDoc documentation

    Command

• Run: php artisan make:repository Forum

The result includes:

    app/Contracts/Repositories/ForumRepository.php
    app/Models/Forum.php
    app/Repositories/Eloquent/ForumRepositoryEloquent.php
    database/migrations/2016_10_28_121408_create_forums_table.php
    

    Tips

See: PHP: The Right Way

    ScreenShot

Login page, Role page, Add news page

    Issue

• Issues for discussion are welcome
• QQ group: 32649336

    Thanks

    License

    The laravel5-backend is open-sourced software licensed under the MIT license

    Visit original content creator repository https://github.com/qloog/XAdmin
  • YoloV5-segmentation-ncnn-RPi4

    YoloV5 segmentation Raspberry Pi 4

    output image

    YoloV5 segmentation with the ncnn framework.

    License
    Special made for a bare Raspberry Pi 4, see Q-engineering deep learning examples


    Benchmark.

    Model size objects mAP RPi 4 64-OS 1950 MHz
    YoloV5n 640×640 nano 80 28.0 1.4 – 2.0 FPS
    YoloV5s 640×640 small 80 37.4 1.0 FPS
    YoloV5l 640×640 large 80 49.0 0.25 FPS
    YoloV5x 640×640 x-large 80 50.7 0.15 FPS
Yolact 550×550 80 28.2 0.28 FPS

    Dependencies.

To run the application, you need:

• A Raspberry Pi 4 with a 32 or 64-bit operating system. It can be the Raspberry 64-bit OS, or Ubuntu 18.04 / 20.04. Install 64-bit OS
    • The Tencent ncnn framework installed. Install ncnn
    • OpenCV 64 bit installed. Install OpenCV 4.5
    • Code::Blocks installed. ($ sudo apt-get install codeblocks)

    Installing the app.

    To extract and run the network in Code::Blocks
    $ mkdir MyDir
    $ cd MyDir
$ wget https://github.com/Qengineering/YoloV5-segmentation-ncnn-RPi4/archive/refs/heads/main.zip
$ unzip -j main.zip
Remove main.zip, LICENSE and README.md as they are no longer needed.
$ rm main.zip
$ rm LICENSE
$ rm README.md
    Your MyDir folder must now look like this:
    parking.jpg
    busstop.jpg
YoloV5-seg.cbp
    main.cpp
    yolov5n-seg.bin
    yolov5n-seg.param
    yolov5s-seg.bin
    yolov5s-seg.param


    Running the app.

To run the application, load the project file YoloV5-seg.cbp in Code::Blocks. More info, or
if you want to connect a camera to the app, follow the instructions at Hands-On.
    Many thanks to FeiGeChuanShu!
    output image


    paypal

    Visit original content creator repository https://github.com/Qengineering/YoloV5-segmentation-ncnn-RPi4
  • Collectibles-2Take1-Lua

    PROJECT ARCHIVED / FAREWELL

    On September 17, 2024, R* added the BattleEye Anti-Cheat to GTA Online, meaning I can no longer work on this or any of my other scripts. All of my 2Take1 Mod Menu scripts will now be archived.

Learning to script for GTA 5 and making my own stuff has been a blast: fun, rewarding, and full of good vibes. But, yeah, it was never exactly allowed, and all good things must come to an end.

    Thanks to everyone who supported my work, whether up close or from a distance. It really kept me motivated.

    If you were into my GTA scripts, you might be interested in a project I’m still working on: GTA-V-Session-Sniffer.


    Online Version Game Build

    Collectibles-2Take1-Lua

Script to collect collectibles. It is in development, but only stable commits are published here.

    Screenshots

    Collectibles Menus:

    Main Menu GTA Online GTA Online > Collectibles GTA Online > Daily Collectibles
    Main Menu GTA Online GTA Online > Collectibles GTA Online > Daily Collectibles

    Credits

    Visit original content creator repository https://github.com/BUZZARDGTA/Collectibles-2Take1-Lua
  • srtgears

    Srtgears™

    Build Status Go Reference Go Report Card

    Srtgears™ is a subtitle engine for reading subtitle files, manipulating / transforming them and then saving the result into another file.

    Srtgears provides some very handy features which are not available in other subtitle tools, for example:

    • merge 2 subtitle files to have dual subs: one at the bottom, one at the top (this is not concatenation, but that’s also supported)
    • lengthen / shorten display duration of subtitles (if you’re a slow reader, you’re gonna appreciate this :))
    • remove hearing impaired (HI) texts (such as "[PHONE RINGING]" or "(phone ringing)")
    • strip off formatting (such as <i>, <b>, <u>, <font>)
    • split the subtitle file at a specified time
    • statistics from the subtitles
    • etc…

    Home page: https://srt-gears.appspot.com

    Presentation

    The Srtgears engine is presented in 3 ways:

    1. Command line tool

    Srtgears is available as a command line tool for easy, fast, scriptable and repeatable usage.

    Binary (compiled) distributions are available on the download page:

    https://srt-gears.appspot.com/download.html

    The command line tool uses only the Go standard library and the srtgears engine (see below).

    2. Web interface: online web page

Srtgears can also be used on the web, for those who do not want to download the tool but just try it out from the browser. It can be found here:

    https://srt-gears.appspot.com/srtgears-online.html

    The web interface is a Google App Engine project, implemented using the Go AppEngine SDK. The server side of the web interface uses the srtgears engine (see below).

    The web folder is the root of the App Engine project. If you want to try it locally, you need to download the Go AppEngine SDK, and it can be started locally by running the goapp serve command of the SDK from the web folder.

    3. Srtgears engine: a Go package

    And last (but not least) a Go package for developers. The engine was designed to be independent from the command line and web interfaces, its API is clear, well documented and easy-to-use.

    To get the source code (along with the sources of the tool and web interface), use go get:

    go get github.com/icza/srtgears
    

    Documentation can be found at:

    http://godoc.org/github.com/icza/srtgears

    To use the engine, first import it:

    import "github.com/icza/srtgears"
    

    And for example using the engine to merge 2 subtitle files to have a dual sub saved in Sub Station Alpha (*.ssa) format:

    sp1, err := srtgears.ReadSrtFile("eng.srt")
    check(err) // Check / handle error
    sp2, err := srtgears.ReadSrtFile("hun.srt")
    check(err) // Check / handle error
    sp1.Merge(sp2)
    err = srtgears.WriteSsaFile("eng+hun.ssa", sp1)
    check(err) // Check / handle error
    

    You can see more usage examples in the package doc.

    Also worth noting that the subtitle transformations of the command line tool and the web interface are driven by the same Executor, it is “outsourced” to the github.com/icza/srtgears/exec package.

    Limits

    Input files must be UTF-8 encoded, output files will be UTF-8 encoded as well.

    Supported input format is SubRip (*.srt) only, supported output formats are SubRip (*.srt) and Sub Station Alpha (*.ssa).

    It should also be noted that SubRip format specification does not include subtitle positioning. Srtgears uses an unofficial extension {\anX} which may not be supported by all video players, or some players interpret the position values differently. MPC-HC has full support for it. In these cases the Sub Station Alpha output format is recommended (where the specification covers subtitle positioning / alignment).

    Story

    Srtgears started as a 48-hour work created for the Gopher Gala 2016 event. The initial version can be found here. I was competing solo.

    License

    See LICENSE

    Visit original content creator repository https://github.com/icza/srtgears
  • -kafCam-kafcam-core-

    kafCam

Application created for fun and to play with Apache Kafka, Camunda and Hexagonal Architecture. At the beginning it was created as a monolith designed using hexagonal architecture; over time, separate functionalities were moved to dedicated projects that communicate via Kafka.

We were looking for a way to play with a large amount of data, and we found an API that exposes currency rates in large numbers. The use cases are totally ‘invented’ by us and were adjusted as the app was created to be more suitable.

    Use cases

    • User can get a currency rates from specified period of time
    • User can get a currency expertise for specified currency from specified period of time
    • Currency expertise is issued based on internal business logic and requires opinion from ‘expert’

    Technical assumptions

• Currency rates are persisted in a Kafka topic
    • Application is Event Driven, Kafka is used for communication between microservices
    • Camunda BPMN is used for issuing the currency opinion

    Setup

• The application requires Apache Kafka
• MongoDB is used as persistence for the results of user requests and also as a Kafka snapshot
• cam, kafprod, kafwork – multiple instances of these can be run simultaneously for scalability
• Kafka needs 5 topics; to set them up, use the script:
      Linux
      Windows

    Project structure

kafcam – the single point of entry; it is responsible for handling user requests via a REST API, orchestrating the use cases, and making the snapshot of the data from the Kafka topic

    cam – Camunda microservice, executes the BPMN

kafprod – Responsible for communication with the external API; acts as a producer for the Kafka currency topic

    kafwork – Handles the request for data that is not available in the snapshot, searches through the Kafka topic to find specific data for specific currency

shared-domain – While separating functionalities from the monolith into separate microservices, we encountered duplication of domain classes. This project contains domain classes for use by the other projects, included as a dependency.
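
As an illustration of the shared-domain idea, a class reused by producers and consumers might look like this (sketched in Python for brevity; the real project is JVM-based, and the field names here are hypothetical, not taken from the repo):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class CurrencyRate:
    """An immutable domain event shared between kafprod, kafwork and kafcam."""
    code: str               # currency code, e.g. "EUR"
    rate: float             # rate against the base currency
    observed_at: datetime   # when the external API reported it

    def key(self) -> str:
        """Kafka message key: partitions all events for one currency together."""
        return self.code

rate = CurrencyRate("EUR", 4.32, datetime.now(timezone.utc))
```

Keeping such classes in one dependency prevents each microservice from drifting toward its own copy of the same domain concept.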

    Authors

    Visit original content creator repository
    https://github.com/THRuM/-kafCam-kafcam-core-

  • PTBDWBC

Repository for the Web Development: Client course from the third semester of the Systems Analysis and Development program at IFSP-Pirituba.

Syllabus1

This curricular component covers the commands of the languages used to create and optimize websites on the web, through the creation of dynamic pages and the addition of multimedia resources. The scope of the component is restricted to the challenges present on the Client side of a Client-Server architecture.

Objectives1

✔️ Enable the student to implement database projects.

✔️ Understand the Client-Server architecture used on the World Wide Web.

✔️ Know the basic elements for developing static and dynamic pages.

✔️ Cloud services for the development of web pages.

✔️ Build web pages using hypertext markup language.

✔️ Use a script-based programming language.

✔️ Create cascading style sheets and use them to build the visual identity of web applications.

Course Content1

• History, evolution and workings of the internet;

• Stages of building a website project;

• Markup language for content formatting;

• Cascading style sheets;

• Script-based language;

• Tools, frameworks and libraries for developing web pages.

Basic Bibliography1

    ROBSON, Elisabeth; FREEMAN, Eric. Use a cabeça!: HTML e CSS. Rio de Janeiro: Alta Books, 2015. 723 p. (Use a cabeça!). ISBN 9788576088622.

    DUCKETT, Jon. HTML & CSS: projete e construa websites. Rio de Janeiro: Alta Books, 2016. 512 p. ISBN 9788576089391.

    GRINBERG, Miguel. Desenvolvimento web com flask: desenvolvendo aplicações web com Python. São Paulo: Novatec, 2018. 310 p. ISBN 9788575226810.

    IEEE Internet Computing. Electronic ISSN: 1941-0131 Print ISSN: 1089-7801. Disponível em: https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=4236.

Complementary Bibliography1

    FLATSCHART, Fábio. HTML 5: embarque imediato. Rio de Janeiro: Brasport, 2011. ISBN 9788574525778.

    MARINHO, Antonio Lopes (Org.). Desenvolvimento de aplicações para internet. São Paulo: Pearson. 2019. ISBN 9788543020112.

    FREEMAN, Eric; ROBSON, Elisabeth. Use a cabeça!: programação em HTML 5: desenvolvendo aplicativos para web com Javascript. Rio de Janeiro: Alta Books, 2014. 573 p. (Use a cabeça). ISBN 9788576088455.

    DEITEL, Paul J.; DEITEL, Harvey M. Ajax, Rich internet applications e desenvolvimento web para programadores. São Paulo: Pearson, 2008. ISBN 9788576051619.

    DUCKETT, Jon. Javascript e Jquery. Desenvolvimento de Interfaces web interativas. 1. ed. Rio de Janeiro: Alta Books, 2016.

    Footnotes

1. Source of the information 2 3 4 5

    Visit original content creator repository
    https://github.com/Gabriel-Lucena/PTBDWBC

  • Blog-Generation-Platform

    Blog-Generation-Platform-With-LLAMA

    Description:

    This repository contains code for generating blog content using the LLama 2 language model. It integrates with Streamlit for easy user interaction. Simply input your blog topic, desired word count, and writing style to generate engaging blog content.

    This GitHub code demonstrates the use of the LLama 2 model for generating blog content using Streamlit. It imports the CTransformers module from langchain.llms and langchain_community.llms for language model integration. The get_response function prompts the user to input a blog topic, number of words, and style, then generates a response using the LLama 2 model. Streamlit is used to create a user-friendly interface for input and output.

    The LLama 2 model is specified with its model file and type, along with configuration parameters such as max_new_tokens and temperature for text generation.

    Streamlit is configured to create a centered layout with collapsed initial sidebar state. Users can input the blog topic, number of words, and select the style of writing (Research, Creative, Technical) before generating the response.

    Finally, upon clicking the “Generate” button, the model generates a response based on the provided inputs and displays it using Streamlit.

    🚀 Features:

    – Easy-to-use interface with Streamlit

    – Integration with LLama 2 model for generating diverse and context-aware blog content

    – Customize writing style for Research, Creative, or Technical topics

    Example

    alt text

    Download requirements:

sentence-transformers
uvicorn
ctransformers
langchain
python-box
streamlit

    Download model from Hugging Face 👈

    alt text

    Steps to run

    1. Import Statements:
      • You import the CTransformers module from both langchain.llms and langchain_community.llms. Ensure that you need both imports and that they are correctly referencing the desired functionality.
    pip install langchain
    pip install langchain_community
    pip install CTransformers
    pip install streamlit
2. Function Definition (get_response):

      • This function is designed to retrieve a response from the LLama 2 model based on user inputs such as the blog topic, number of words, and writing style.
      • It initializes the LLama 2 model with the specified parameters, including the model file, model type, and configuration options such as max_new_tokens and temperature.
      • You define a prompt template to guide the model in generating the blog content based on the provided inputs.
      • The function then generates a response from the LLama 2 model using the prompt template and returns it.
3. Streamlit Setup:

      • You configure the Streamlit page with a title and layout settings.
      • The user is presented with input fields to enter the blog topic, number of words, and select the writing style (e.g., Research, Creative, Technical).
      • Upon clicking the “Generate” button, the function get_response is called with the provided inputs, and the generated response is displayed on the Streamlit interface.
4. Final Response:

      • The generated response from the LLama 2 model is displayed in the Streamlit interface for the user to review.
      • Run the command in terminal
      •  streamlit run webapp.py
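
The prompt assembly in get_response can be sketched without loading the model. The template wording below is an assumption, not the exact text from the repo; only the three user inputs come from the Streamlit form:

```python
# Illustrative prompt construction for the get_response flow.
# The template text is hypothetical; topic, no_words and style are the
# three values collected from the Streamlit inputs.
PROMPT_TEMPLATE = (
    'Write a blog for a {style} audience on the topic "{topic}" '
    "within {no_words} words."
)

def build_prompt(topic: str, no_words: int, style: str) -> str:
    """Fill the template with the user's inputs before sending it to the LLM."""
    return PROMPT_TEMPLATE.format(style=style, topic=topic, no_words=no_words)

prompt = build_prompt("Apache Kafka", 300, "Technical")
```

In the actual app, this string is what the CTransformers-wrapped LLama 2 model receives for generation.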
    Visit original content creator repository https://github.com/AkashKobal/Blog-Generation-Platform
  • fractals-with-chaos-game

    Fractals with Chaos Game

    Link to hosted project: Fractals with Chaos Game

    Fractal patterns are created using chaos game

    Details for all fractals can be found at the bottom.

    Made with JavaScript and visualized with the p5.js library

    Most general definition:
    There are fixed vertices which can be defined by the vertices of a shape, for example a triangle. One vertex is chosen as the starting point then another random vertex is chosen. The midpoint is found between the two points and a dot is drawn there. This midpoint becomes the new starting point and another vertex is chosen at random. The midpoint is then found again and a dot is drawn. A pattern appears as these steps continue. This algorithm creates fractals such as: the Sierpinski Triangle, Sierpinski Carpet and Vicsek Fractal. From a seemingly random algorithm, interesting fractals appear.
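
The steps above can be sketched directly (in Python here for brevity; the project itself uses p5.js):

```python
import random

def chaos_game(vertices, iterations=10000, ratio=0.5, seed=42):
    """Generate chaos-game points: repeatedly move a fraction `ratio`
    of the way toward a randomly chosen vertex."""
    rng = random.Random(seed)
    x, y = vertices[0]            # start at one of the vertices
    points = []
    for _ in range(iterations):
        vx, vy = rng.choice(vertices)
        x += (vx - x) * ratio     # the midpoint when ratio == 0.5
        y += (vy - y) * ratio
        points.append((x, y))
    return points

# Three vertices with ratio 0.5: the points trace a Sierpinski Triangle.
triangle = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
pts = chaos_game(triangle)
```

Changing `vertices` and `ratio` gives the other fractals below, e.g. a 2/3 jump toward the vertices of a square with its midpoints produces the Sierpinski Carpet.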

    With 3 vertices, a Sierpinski Triangle forms:

Other interesting results may arise when more restrictions are placed, vertices are added, or the jump deviates from the midpoint


    Restricted Regions


    The area in the centre is restricted for these fractals. Points cannot be placed within the central symbol which also prevents them from existing within the smaller ones. This creates a fractal pattern of symbols. Any symbol can be used.


    Restricted Movement


    Restrictions are set for what the next point can be based on what the previously chosen point was.


    Additional Vertices & Non-midpoint Jumps


New vertices are added and the distance travelled to the next chosen point is not exactly the midpoint.


    Five Vertices are Chosen in the Shape of a Pentagon


    On the left, the chosen point cannot be chosen again in the next iteration.
    On the right, the point jumps a distance 1 / PHI to the chosen point.


    Result Restriction
    Sierpinski Triangle:
    3 vertices
    Points cannot be placed within the Nike logo
    Same as above but a Pi symbol is used
    Same as above but a treble clef is used
    The chosen point cannot be chosen again in the next iteration
    The chosen point cannot be diagonal from the previous point
The point chosen cannot be adjacent and anti-clockwise to the previous point
    Sierpinski Carpet:
    The point jumps 2/3 of the way to the chosen point
    The midpoints along the edges are considered vertices
    Vicsek Fractal:
    The point jumps 2/3 of the way to the chosen point
    The centre is also considered a vertex
    The chosen point cannot be chosen again in the next iteration (5 vertices)
    The point jumps 1/PHI of the way to the chosen point

    Sources:
    https://en.wikipedia.org/wiki/Chaos_game

    Visit original content creator repository https://github.com/tansonlee/fractals-with-chaos-game
  • imap_maildir_date_fixer

    IMAP Maildir Date Fixer

    This is a shell script to fix up date problems with IMAP mail repositories maintained by Dovecot email servers that use the Maildir format. There are separate versions for FreeBSD and Linux — the date and touch commands are quite different in the two systems.

    The Problem

    Some mail clients, such as macOS Mail and Microsoft Outlook for Macintosh, display incorrect dates for mail messages stored on a Dovecot IMAP mail server that uses the Maildir format to store emails.

    The reason for the incorrect dates is that the clients do not look at a mail’s reception or transmission date – instead, they use the modification date of the file containing the mail. If the modification date is different to the reception or transmission date, then the mail is tagged with the wrong date.

The modification date can inadvertently be changed if, for example, the mail repository is copied without preserving the modification date.

    The Solution

    The solution sounds pretty simple: extract each email’s actual date from its mail headers and use this to reset the file’s modification date. That’s just what this script does. But there’s a little more to it:

    1. The script recursively crawls the directory you specify looking for all directories with the name cur.
2. For each cur directory, it treats all the files in it as files containing emails, extracts a date from each one and sets the file’s modification date to that.
    3. It deletes the dovecot.index, dovecot.index.cache, dovecot.list.index files from the cur directory’s parent directory. This causes the mail system to rebuild them using the now-updated file modification dates, thus fixing the problem on the server.
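
The per-file step can be sketched in Python (the actual scripts are shell; the function name here is illustrative):

```python
import email
import os
from email.utils import parsedate_to_datetime

def fix_mail_mtime(path):
    """Set a Maildir file's modification time to the Date: header of the
    mail inside it - the per-file operation the shell scripts perform."""
    with open(path, "rb") as f:
        msg = email.message_from_binary_file(f)
    date_header = msg["Date"]
    if date_header is None:
        return False                    # no Date: header; leave the file alone
    stamp = parsedate_to_datetime(date_header).timestamp()
    os.utime(path, (stamp, stamp))      # set both atime and mtime
    return True
```

Applied to every file under a cur directory, this restores the modification dates that clients like macOS Mail display.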

    Solving the Problem

    Before you begin, make sure the script is executable, e.g. on FreeBSD by doing:

    $ chmod 755 fiximapdates_bsd.sh
    
1. Run the script as the superuser, giving it the parent directory as an argument – see below for some examples.
    2. Restart dovecot to make it use the updated information.

    In the following example, the mail server is set up to use Virtual Users, and all email is stored in /var/mail/vhosts. Individual domains are further in, and individual users are further in again. For example, to fix up the dates for user joe in the domain domain.com:

    # ./fiximapdates_bsd.sh /var/mail/vhosts/domain.com/joe
    

    This will trawl through all the mail in all the cur directories belonging to the user joe@domain.com on a FreeBSD system. It does not look in the new or tmp directories. It gives an indication of progress when updating a directory of mail files.

    # ./fiximapdates_linux.sh /var/mail/vhosts/domain.com/
    

    This will trawl through all the mail of all users with email accounts on this server for domain.com on a Linux system.

    # ./fiximapdates_linux.sh /var/mail/vhosts
    

    This will trawl through all the mail of all users with email accounts on a Linux server.

    Using the Script

    Here is the start of a log of a session with fiximapdates_bsd.sh traversing a 30 GB Maildir repository. The last line is a progress indicator.

    # ./fiximapdates_bsd.sh /var/mail/vhosts/domain.com
    Processing "/var/mail/vhosts/domain.com/administrator/.Apple Mail To Do"
    Processing "/var/mail/vhosts/domain.com/administrator/.Archive"
    Processing "/var/mail/vhosts/domain.com/administrator/.Deleted Messages (Administrator@Home)"
        Updating 13 mail files in "/var/mail/vhosts/domain.com/administrator/.Deleted Messages (Administrator@Home)"...
    Processing "/var/mail/vhosts/domain.com/administrator/.Deleted Messages"
        Updating 7 mail files in "/var/mail/vhosts/domain.com/administrator/.Deleted Messages"...
    Processing "/var/mail/vhosts/domain.com/administrator/.Drafts"
    Processing "/var/mail/vhosts/domain.com/administrator/.Fan Mail"
        Updating 5681 mail files in "/var/mail/vhosts/domain.com/administrator/.Fan Mail"...
    Processing "/var/mail/vhosts/domain.com/administrator/.Junk"
        Updating 2 mail files in "/var/mail/vhosts/domain.com/administrator/.Junk"...
    Processing "/var/mail/vhosts/domain.com/administrator/.Liz's Stuff"
        Updating 90 mail files in "/var/mail/vhosts/domain.com/administrator/.Liz's Stuff"...
    Processing "/var/mail/vhosts/domain.com/administrator/.Notes"
    Processing "/var/mail/vhosts/domain.com/administrator/.Sent Messages"
        Updating 41 mail files in "/var/mail/vhosts/domain.com/administrator/.Sent Messages"...
    Processing "/var/mail/vhosts/domain.com/administrator"
        Updating 4117 mail files in "/var/mail/vhosts/domain.com/administrator"...
    Processing "/var/mail/vhosts/domain.com/david/.Apple Mail To Do"
    Processing "/var/mail/vhosts/domain.com/david/.Archive"
    Processing "/var/mail/vhosts/domain.com/david/.Deleted Messages"
        Updating 3 mail files in "/var/mail/vhosts/domain.com/david/.Deleted Messages"...
    Processing "/var/mail/vhosts/domain.com/david/.Drafts"
    Processing "/var/mail/vhosts/domain.com/david/.Junk"
    Processing "/var/mail/vhosts/domain.com/david/.Notes"
    Processing "/var/mail/vhosts/domain.com/david/.Sent Messages"
        Updating 2870 mail files in "/var/mail/vhosts/domain.com/david/.Sent Messages"...
    Processing "/var/mail/vhosts/domain.com/david"
        Updating 3834 mail files in "/var/mail/vhosts/domain.com/david"...
    Processing "/var/mail/vhosts/domain.com/david/.Apple Mail To Do"
    Processing "/var/mail/vhosts/domain.com/david/.Archive"
    Processing "/var/mail/vhosts/domain.com/david/.Deleted Messages"
        Updating 2 mail files in "/var/mail/vhosts/domain.com/david/.Deleted Messages"...
    Processing "/var/mail/vhosts/domain.com/david/.Drafts"
    Processing "/var/mail/vhosts/domain.com/david/.Junk"
    Processing "/var/mail/vhosts/domain.com/david/.Notes"
    Processing "/var/mail/vhosts/domain.com/david/.Sent Messages"
        Updating 26 mail files in "/var/mail/vhosts/domain.com/david/.Sent Messages"...
    Processing "/var/mail/vhosts/domain.com/david"
        Updating 341 mail files in "/var/mail/vhosts/domain.com/david"...
    Processing "/var/mail/vhosts/domain.com/foobar/.20090731.3BA29-3D4"
        Updating 10 mail files in "/var/mail/vhosts/domain.com/foobar/.20090731.3BA29-3D4"...
    Processing "/var/mail/vhosts/domain.com/foobar/.20090731.Inbox"
        Updating 4250 mail files in "/var/mail/vhosts/domain.com/foobar/.20090731.Inbox"...
    Processing "/var/mail/vhosts/domain.com/foobar/.20090731.Re WBO"
        Updating 19 mail files in "/var/mail/vhosts/domain.com/foobar/.20090731.Re WBO"...
    Processing "/var/mail/vhosts/domain.com/foobar/.20090731.Sent"
        Updating 1193 mail files in "/var/mail/vhosts/domain.com/foobar/.20090731.Sent"...
    Processing "/var/mail/vhosts/domain.com/foobar/.20090731"
    Processing "/var/mail/vhosts/domain.com/foobar/.20100731.Inbox"
        Updating 1535 mail files in "/var/mail/vhosts/domain.com/foobar/.20100731.Inbox"...
    Processing "/var/mail/vhosts/domain.com/foobar/.20100731.Sent"
        Updating 788 mail files in "/var/mail/vhosts/domain.com/foobar/.20100731.Sent"...
    Processing "/var/mail/vhosts/domain.com/foobar/.20100731"
        Updating 1 mail files in "/var/mail/vhosts/domain.com/foobar/.20100731"...
    Processing "/var/mail/vhosts/domain.com/foobar/.20120731.Inbox"
        Updating 8115 mail files in "/var/mail/vhosts/domain.com/foobar/.20120731.Inbox"...
        File /var/mail/vhosts/domain.com/foobar/.20120731.Inbox/cur/1351091506.M290526P89101.alix.domain.com,S=135487:2,Sab -- can't translate date "Thu, 2 Dec 2010".
        File /var/mail/vhosts/domain.com/foobar/.20120731.Inbox/cur/1351091506.M290527P89101.alix.domain.com,S=135314:2,Sab -- can't translate date "Thu, 2 Dec 2010".
        File /var/mail/vhosts/domain.com/foobar/.20120731.Inbox/cur/1351091516.M825619P89101.alix.domain.com,S=134792:2,S -- can't translate date "Thu, 25 Nov 2010".
        File /var/mail/vhosts/domain.com/foobar/.20120731.Inbox/cur/1351091549.M592740P89101.alix.domain.com,S=2948740:2,Sab -- can't translate date "Tue, 30 Nov 2010".
        File /var/mail/vhosts/domain.com/foobar/.20120731.Inbox/cur/1351091558.M369413P89101.alix.domain.com,S=137645:2,Sab -- can't translate date "Mon, 1 Nov 2010".
        File /var/mail/vhosts/domain.com/foobar/.20120731.Inbox/cur/1351091577.M492010P89101.alix.domain.com,S=134729:2,S -- can't translate date "Thu, 25 Nov 2010".
        File /var/mail/vhosts/domain.com/foobar/.20120731.Inbox/cur/1351091605.M665664P89101.alix.domain.com,S=134458:2,Sab -- can't translate date "Fri, 10 Dec 2010".
        File /var/mail/vhosts/domain.com/foobar/.20120731.Inbox/cur/1351091624.M378649P89101.alix.domain.com,S=134391:2,S -- can't translate date "Wed, 15 Dec 2010".
        File /var/mail/vhosts/domain.com/foobar/.20120731.Inbox/cur/1351091643.M679049P89101.alix.domain.com,S=279547:2,Sab -- can't translate date "Tue, 30 Nov 2010".
    Processing "/var/mail/vhosts/domain.com/foobar/.20120731.Sent"
        Updating 4108 mail files in "/var/mail/vhosts/domain.com/foobar/.20120731.Sent"...
    Processing "/var/mail/vhosts/domain.com/foobar/.20120731"
        Updating 1 mail files in "/var/mail/vhosts/domain.com/foobar/.20120731"...
    Processing "/var/mail/vhosts/domain.com/foobar/.20130731.Inbox"
        Updating 6716 mail files in "/var/mail/vhosts/domain.com/foobar/.20130731.Inbox"...
    Files checked: 1680
    ...
    

    Finishing Up

    Once you have repaired the information on the mail server and restarted it so the updates take effect, you need to update the clients. This can be messy because clients typically store local copies of the mail that will now be out of date. One easy way is simply to delete and recreate the client account completely. (Note, it’s probably not enough to disable and then re-enable the account, as the local copies may be preserved while the account is disabled and returned to use when it is re-enabled.)

    Limitations

    1. It only fixes up problems due to incorrect modification dates on Maildir files.
    2. It maps dates and times to local time using the date utility on FreeBSD and the GNU date input facilities on Linux.
      Linux is somewhat more flexible in its interpretation of date strings, but that flexibility means faulty strings can slip through unnoticed. For instance, a date string without time or zone information will be accepted on Linux but rejected on FreeBSD.
    3. It has not been tested in different locales. Things to watch out for are incorrect times and problems with non-English dates and date formats.
    4. It has not been extensively tested. Back up your mail repository before using it.
    5. It uses the first date it finds in a mail file, which may not always be exactly right.
    6. It’s a bit slow.
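
    To make limitations 2 and 5 concrete, here is a hypothetical sketch (not the tool’s actual code) of the kind of repair involved: take the first Date: header in a Maildir file, parse it with GNU date, and stamp it back onto the file as its modification time. All file names here are made up for illustration, and the commands assume GNU date (Linux); BSD date uses different flags and rejects headers that lack time or zone information.

    ```shell
    #!/bin/sh
    # Illustration only: create a sample mail file with a Date: header.
    msg=$(mktemp)
    printf 'Date: Thu, 2 Dec 2010 14:30:00 +0000\nSubject: test\n\nbody\n' > "$msg"

    # Limitation 5: only the first Date: header found is used.
    hdr=$(grep -m1 '^Date:' "$msg" | sed 's/^Date: *//')

    # Limitation 2: GNU date also tolerates a header with no time or zone,
    # e.g. "Thu, 2 Dec 2010", which BSD date would reject outright.
    date -d "Thu, 2 Dec 2010" +"%Y-%m-%d"

    # Convert the full header date to the [[CC]YY]MMDDhhmm[.SS] form that
    # touch -t expects, then stamp the file's modification time with it.
    stamp=$(date -u -d "$hdr" +"%Y%m%d%H%M.%S")
    touch -t "$stamp" "$msg"
    ls -l "$msg"
    rm -f "$msg"
    ```

    The "can't translate date" lines in the log above correspond to headers like "Thu, 2 Dec 2010" that the stricter parser could not convert.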

    Original repository: https://github.com/mikebrady/imap_maildir_date_fixer