Me - photo

Hi! My name is Pedro. I'm a Computer Science graduate working mainly with web development. This is my personal webpage and portfolio. I'm currently located in São Paulo, Brazil.

  • (2017-2021) - GL Consultoria

    In my current job I have worked mainly with web development, but some of my non-programming activities have been:

    • Helping with the integration of a new system for the company. The old system was outdated and no longer supported.
    • Participating in meetings and in electronic bidding sessions for acquiring new services.
    • Acting as coordinator for activities the company was contracted for. The usual responsibilities were:
      • Traveling to and preparing the place where the service would be carried out.
      • Training newcomers on the procedures of the job.
      • Handling occasional problems that might arise during the service.
      • Paying the employees for the contracted service (the service lasts no longer than one day of work).
    • Writing questions on Technology topics. These questions are used in the services the company is contracted for.

    When not occupied with those non-programming activities, I had the chance to work with some of the latest web technologies, demonstrated in the following pet projects:

    • Chute do Brasileirao - check out on

      This is a small Ruby on Rails website that shows information about the Brasileirão (the Brazilian soccer championship organized by the CBF). To accomplish that, I scheduled a job that regularly scrapes data from the CBF official page. The webpage also lets you bet on the scores of the day's matches. Some notable features of this project are:

      • Because the scraped data doesn't change often, I fetch it from the database manually and cache it in Redis. To be more precise, I cache the partial view rendered with the data. This is very similar to how Rails caches partials; the difference is that I don't hit the database on every request to check for updates (see the first sketch after this list).
      • I work with Active Storage to save each team's flag. While scraping, we save the URL of the team's flag image; later, a Sidekiq job downloads it locally, resizes it for performance reasons, and saves it to AWS S3 (see the second sketch after this list).
      • I use the StimulusJS framework to drive the SwiperJS library, creating sliders for the next/previous championship matches. While SwiperJS offers a great API for working with sliders, it doesn't provide the logic for loading data into the sliders dynamically.
      • The website lets you log in with a Twitter or Google account. I also created an API, with API-key authentication, to serve the scraped data.
      • You can find more info about the project in the README on GitHub.
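
      A minimal sketch of the caching idea (controller, model, partial, and cache keys are illustrative, not the project's actual code):

        # Sketch: cache the rendered partial in Redis and rebuild it only when
        # the scraping job bumps a version key, so regular requests never hit
        # the database just to check for updates.
        class MatchesController < ApplicationController
          def index
            version = Rails.cache.read("matches/version") || 0 # bumped by the scrape job
            @matches_html = Rails.cache.fetch("matches/partial/v#{version}") do
              matches = Match.all # the database is hit only on a cache miss
              render_to_string(partial: "matches/list", locals: { matches: matches })
            end
          end
        end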
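
      And a sketch of the flag-download step (the model and attribute names are assumptions for illustration):

        require "open-uri"

        # Sketch: a Sidekiq job downloads the flag image from the URL captured
        # while scraping and attaches it with Active Storage (backed by S3).
        class DownloadTeamFlagJob
          include Sidekiq::Worker

          def perform(team_id)
            team = Team.find(team_id) # hypothetical model
            return if team.flag.attached?

            io = URI.open(team.flag_url) # URL saved during scraping
            team.flag.attach(io: io, filename: "flag-#{team_id}.png")
            # Resizing for performance can then be done with a variant, e.g.
            # team.flag.variant(resize_to_limit: [64, 64]).processed
          end
        end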

      This project was built with:

      • Ruby on Rails
      • StimulusJS
      • Bootstrap CSS
      • Postgres
      • Redis
      • Sidekiq
      • RabbitMQ
      • Docker
      • Nginx
    • ScrapCbf - check out on

      This project is a Ruby gem for scraping data from the CBF official page. The scraped data includes the championship's matches, the ranking table, and the teams. The CBF page has data from the 2012 championship up to the current one, and the matches and ranking table for the current championship are updated daily. This project was built with the following (a short scraping sketch follows the list):

      • Ruby
      • Nokogiri (for scraping data)
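
      A rough sketch of the scraping approach (the URL and CSS selectors below are assumptions, not the gem's actual internals):

        require "nokogiri"
        require "open-uri"

        # Sketch: fetch a championship page and extract one hash per match.
        html = URI.open("https://www.cbf.com.br/...") # illustrative URL
        doc = Nokogiri::HTML(html)

        matches = doc.css(".match-row").map do |row| # selector is hypothetical
          {
            home: row.css(".home-team").text.strip,
            away: row.css(".away-team").text.strip,
            score: row.css(".score").text.strip
          }
        end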
    • ScrapCbfRecord - check out on

      This project is a Ruby gem that complements the ScrapCbf gem. It takes ScrapCbf's output and saves it to the database. Right now it only supports the Active Record ORM. The gem offers some flexibility for editing the data before saving: it lets you name the models for the scraped entities, rename the entities' attributes, delete attributes, and choose whether to create database associations between entities.
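
      A sketch of that configuration idea (the DSL below is illustrative only, not the gem's documented API; see its README for the real options):

        # Illustrative only -- not the gem's documented API. The idea: map a
        # scraped entity to your own model name, rename or drop attributes,
        # and opt out of database associations.
        ScrapCbfRecord.settings do |config|
          config.match.class_name = "Game"
          config.match.rename_attrs(start_at: :played_at)
          config.match.exclude_attrs(:updates)
          config.match.associations = false
        end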

    • Personal Webpage - check out on

      This is my personal webpage, built with Gatsby. I'm still working on it; right now I have only this page, but I plan to add a page for posting my personal experiences and thoughts. This project was built with:

      • React
      • Tailwind CSS
      • Docker
      • Nginx
  • (2012-2017) - Bachelor's degree in Computer Science from UEL (Universidade Estadual de Londrina)

    My final thesis was in the field of Computer Security. Its main idea consisted of applying the modularity algorithm to a dataset of alerts generated by an IDS (Intrusion Detection System). The dataset was generated during a CTF contest in 2008 (CTF stands for Capture the Flag, a cyber hacking competition), and the winning team gave an interview explaining how they proceeded to hack the computers located on a private network and capture the flag. To apply the modularity algorithm to the dataset, we had to build a graph from the alerts' source and destination IP addresses. By applying the modularity algorithm to this graph, and with the help of a graph-visualization algorithm, we obtained the following image:

    Tcc image

    See notes 1 and 2 about the image.

    Short description of the image: the vertices represent computers (which we call hosts from now on) and the edges represent the interactions among them. The colours are added by the graph-visualization algorithm and are used to distinguish groups of hosts with common interactions (these groups are the result of applying the modularity algorithm to the graph). The vertices with a high number of edges that sit apart from the groups are the hosts that were compromised or that suffered multiple attack attempts.
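
    As a rough sketch of the graph construction described above (the alert fields are assumptions for illustration):

      # Sketch: each IDS alert contributes a weighted edge from its source IP
      # to its destination IP; the resulting edge list can be loaded into a
      # tool such as Gephi, which applies the modularity algorithm and colours
      # the detected groups.
      alerts = [
        { src_ip: "10.0.0.5", dst_ip: "10.0.0.9" }, # sample alerts
        { src_ip: "10.0.0.5", dst_ip: "10.0.0.9" },
        { src_ip: "10.0.0.7", dst_ip: "10.0.0.5" }
      ]

      edges = Hash.new(0)
      alerts.each do |alert|
        edges[[alert[:src_ip], alert[:dst_ip]]] += 1 # weight = number of alerts
      end

      File.open("alerts.csv", "w") do |f|
        f.puts "source,target,weight"
        edges.each { |(src, dst), weight| f.puts "#{src},#{dst},#{weight}" }
      end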


    Finally, with the detailed interview given by the winning team of the CTF contest, we could understand how the attacks and compromises were carried out, and also confirm the hypotheses we had drawn solely from analyzing the image of the graph. From the image we were able to identify the hosts that were compromised and the ones under attack. A real-time, interactive application using this technique would be able to show, visually, the chronological order of attacks and compromised hosts.


    Note 1: I had to reduce the quality of the image because of the file size. As a result, one edge is missing: the one connecting the isolated green vertex closest to the red vertices to the isolated red vertex centered between the group of red vertices and the group of red and purple vertices. This edge is important because it identifies the first compromised computer and its connection to the second compromised one.


    Note 2: Because this is a small, non-interactive image, I had to remove the vertices' IP addresses from it. The IP addresses are useful for identifying each computer on the network and also for discovering the compromised computers with the help of the detailed interview given by the winning team of the CTF contest.

pedrogglima © 2021