@zioalex
zioalex / configure_cuda_p70.md
Created August 26, 2019 19:46 — forked from alexlee-gk/configure_cuda_p70.md
Use integrated graphics for display and NVIDIA GPU for CUDA on Ubuntu 14.04

This was tested on a ThinkPad P70 laptop with Intel integrated graphics and an NVIDIA GPU:

lspci | egrep 'VGA|3D'
00:02.0 VGA compatible controller: Intel Corporation Device 191b (rev 06)
01:00.0 VGA compatible controller: NVIDIA Corporation GM204GLM [Quadro M3000M] (rev a1)

A reason to use the integrated graphics for display is if installing the NVIDIA drivers causes the display to stop working properly. In my case, Ubuntu would get stuck in a login loop after installing the NVIDIA drivers. This happened regardless of whether I installed the drivers from the "Additional Drivers" tab in "System Settings" or from ppa:graphics-drivers/ppa on the command line.
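One way to keep the display on the integrated GPU (leaving the NVIDIA card free for CUDA) is to pin X to the Intel device in xorg.conf. A minimal sketch of that approach, not the gist's exact file; the BusID is an assumption taken from the lspci output above:

```
# /etc/X11/xorg.conf (sketch) - drive the display from the Intel iGPU
Section "Device"
    Identifier "intel"
    Driver     "intel"
    BusID      "PCI:0:2:0"   # matches 00:02.0 in the lspci output above
EndSection
```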

2020-01-15T18:02:39.472+0100 [DEBUG] plugin.terraform-provider-aws_v2.41.0_x4: -----------------------------------------------------
2020-01-15T18:02:40.138+0100 [DEBUG] plugin.terraform-provider-aws_v2.41.0_x4: 2020/01/15 18:02:40 [DEBUG] [aws-sdk-go] DEBUG: Response acm/ImportCertificate Details:
2020-01-15T18:02:40.138+0100 [DEBUG] plugin.terraform-provider-aws_v2.41.0_x4: ---[ RESPONSE ]--------------------------------------
2020-01-15T18:02:40.138+0100 [DEBUG] plugin.terraform-provider-aws_v2.41.0_x4: HTTP/1.1 400 Bad Request
2020-01-15T18:02:40.138+0100 [DEBUG] plugin.terraform-provider-aws_v2.41.0_x4: Connection: close
2020-01-15T18:02:40.138+0100 [DEBUG] plugin.terraform-provider-aws_v2.41.0_x4: Content-Length: 154
2020-01-15T18:02:40.138+0100 [DEBUG] plugin.terraform-provider-aws_v2.41.0_x4: Content-Type: application/x-amz-json-1.1
2020-01-15T18:02:40.138+0100 [DEBUG] plugin.terraform-provider-aws_v2.41.0_x4: Date: Wed, 15 Jan 2020 17:02:39 GMT
2020-01-15T18:02:40.138+0100 [DEBUG] plugin.terraform-pr
#!/usr/bin/env bash
# usernames: normally first letter of the first name + surname
users="${@:-jsmith}"
# primary group: normally "users"
group="users"
# supplementary groups: normally adm and/or sudo
groups="adm,sudo"
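The fragment above only defines variables; a minimal sketch of how they might be consumed. The useradd loop is an assumption, not part of the gist, and `echo` is left in as a dry run:

```shell
#!/usr/bin/env bash
# Dry-run sketch: print the useradd command for each user.
# The useradd flags below are an assumption, not from the original gist.
users="${@:-jsmith}"   # usernames, space-separated
group="users"          # primary group
groups="adm,sudo"      # supplementary groups

for user in $users; do
    # -m: create home dir, -g: primary group, -G: supplementary groups
    echo useradd -m -g "$group" -G "$groups" "$user"   # drop `echo` to apply
done
```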
@zioalex
zioalex / week4.md
Last active June 26, 2022 12:14
mlops_zoomcamp

Model deployment

  • Batch (offline) deployment - not real-time; predictions are produced periodically
  • Online deployment - the model is always running and serves predictions on demand
    • Web service (request/response)
    • Streaming (event-driven)

Batch mode

The model runs every X amount of time - regularly, e.g. hourly or daily.
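The "every X time" schedule is typically just a cron job. A minimal sketch, assuming a hypothetical score.py batch-scoring script (the path and script name are placeholders):

```
# crontab entry (sketch): run a hypothetical batch-scoring script daily at 02:00
0 2 * * * /usr/bin/python3 /opt/models/score.py >> /var/log/score.log 2>&1
```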

@zioalex
zioalex / Crash on Azure Devops #11946
Last active August 16, 2022 13:11
Packer Crash in Azure - panic: runtime error: invalid memory address or nil pointer dereference #11946
2022-08-16T11:29:15.0720978Z ##[debug]**************************************/
2022-08-16T11:29:15.0721626Z ##[debug]Match start index: -1
2022-08-16T11:29:15.0722256Z ##[debug]Match start index: -1
2022-08-16T11:29:15.0722883Z ##[debug]Match start index: -1
2022-08-16T11:29:15.0723502Z ##[debug]Match start index: -1
2022-08-16T11:29:15.0724128Z ##[debug]Match start index: -1
2022-08-16T11:29:15.0724751Z ##[debug]Match start index: -1
2022-08-16T11:29:30.2881807Z ==> azure-arm:
2022-08-16T11:29:30.2883346Z ==> azure-arm: Cleanup requested, deleting resource group ...
2022-08-16T11:29:30.2883609Z
1. # create a new .py file with the code found below
2. # install ollama
3. # pull the model you want: "ollama run mistral"
4. conda create -n autogen python=3.11
5. conda activate autogen
6. which python
7. python -m pip install pyautogen
8. ollama run mistral
9. ollama run codellama
10. # open a new terminal