See genUsage.txt for general usage instructions and comments
Directory Structure
Ordered by project workflow (i.e., not alphabetical)
Directory: partitionDemos/
Contains demo code for implementing the Cole-Anticevic (CA) network partition to organize functional connectivity (FC) estimates (e.g., adjacency matrices) into a network community structure. Also contains demo code for assessing empirical resting-state reassignments relative to the CA partition and applying this empirically adjusted partition to task-evoked FC data (used in the analyses represented in the following directories)
Directory: restTaskFC_Comparisons/
Contains MATLAB functions for computing resting-state FC and task-state FC with Pearson’s Correlation Coefficient
Contains demo code for assessing rest-to-task changes in FC, e.g., descriptive analyses of changes to FC architecture between the resting state and task-evoked state(s), as opposed to mechanistic and/or functional analyses (see below)
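The FC computation described above can be sketched in a few lines (a language-agnostic illustration with hypothetical function names; the actual functions in this directory are MATLAB):

```python
from statistics import mean, pstdev

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length time series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

def fc_matrix(timeseries):
    """Region-by-region FC (adjacency) matrix from a list of regional time series."""
    n = len(timeseries)
    return [[pearson(timeseries[i], timeseries[j]) for j in range(n)]
            for i in range(n)]
```

The same matrix would be computed for the resting-state and each task-state run, then compared entrywise to describe rest-to-task changes.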
Directory: networkDiagnosticMetrics/
Contains demo code for diagnosing network-mechanisms of interest. Diagnostic metrics demonstrated include: global variability coefficient (GVC), between-network variability coefficient (BVC), and network flexibility (NF)
See networkDiagnosticMetrics/References.md for a full reference list associated with each metric (and additional info on public repositories outside of GitHub)
Directory: cartographicMethod/
Contains demo code for implementing our cartographic representation of cognitive control system functioning, vis-a-vis the previously demonstrated analyses and diagnostic metrics
Directory: sampleResults/
Contains sample figures and videos associated with the project herein. See sampleResultsText.txt for associated text.
I am not taking security, performance, concurrency, and other such concerns into consideration.
I am also aware of the low unit-testing code coverage.
Improvements
Repository method names should be replaced with ones that follow good naming practices.
The Swagger URL root should be extracted into a constant.
Replace the HTTP response codes with the correct ones.
To capture every state change, I strongly recommend using Event Sourcing.
Add authentication to protect the routes as well.
Allow creating new users.
Excel Add-In for Streamlined Stock Taking and Tallying
Welcome to the MRF Utility—an advanced Excel add-in designed to optimize stock-taking and tallying processes. Whether you’re managing stock data or automating workflows, this tool ensures precision and saves valuable time.
🚀 Key Features
Stock Taking Program
Easily manage stock-taking sheets by product groups and efficiently tally data for the INS Tube Flap.
Customizable Reports
Export data directly from SAP transactions (ZNBPSTK, ZDL) and process it seamlessly in Excel.
Set Checker Tool
Validate and cross-check set billings with a single click for accurate reporting.
Pull Compliance Tool
Automate the FDC to SOF workflows using predefined templates, streamlining data handling.
📋 Getting Started
Prerequisites
Microsoft Excel 2010 or Later
Ensure compatibility with the add-in.
Trust Center Settings
Modify these settings during the initial setup (one-time requirement).
🛠️ Installation Steps
Download the latest release of the add-in from the GitHub Releases page.
Locate the .xlam file, right-click it, and select Properties. Under Security, check Unblock, and click OK.
Open Excel and navigate to:
File > Options > Add-ins > Manage: Excel Add-ins > Go...
Click Browse, locate the .xlam file, and click OK to add it.
Access the utility via the M R F tab in the Excel Ribbon.
⚙️ How to Use
Export Data from SAP
Generate spreadsheets from relevant SAP transactions (e.g., ZNBPSTK, ZDL).
Utilize the MRF Utility Features
Stock Taking Program: Use the M R F tab to execute the INS Tube Flap tallying process.
Set Checker Tool: Validate and report on set billings effortlessly.
Pull Compliance Tool: Manage FDC to SOF workflows with ease using predefined templates.
🤝 Contributing
If you know Excel programming and want to improve the tool or add new features, follow these steps:
Fork this repository.
Create a new branch for your feature:
git checkout -b feature/your-feature-name
Commit your changes:
git commit -m "Add your feature"
Push the branch:
git push origin feature/your-feature-name
Submit a pull request for review.
💬 Support
Have questions, suggestions, or issues? Feel free to reach out via the Issues section on GitHub.
API-to-API connections are federated through Identity Server. We are using the Client Credentials bearer-token authentication model for this. We therefore need to create two API projects; let's call them API-A and API-B. Then we need to create an Identity Server to sit in the middle and federate secure access.
We need to communicate with an endpoint in API-B from API-A.
Configure API Project A
Create a new ASP.NET Core WebAPI project. We will call it API-A.
Configure API Project B
Create a new ASP.NET Core WebAPI project. We will call it API-B.
Configure Identity Server
Create a new ASP.NET Core MVC project. We will call it Identity Server.
Now we need to grab the contents of AppSettings.json and provide them to IdentityServer 4 in a meaningful format. Let's create a model, matching the AppSettings.json provided above, for parsing.
Now we are going to implement a custom "Resource Owner Password" validator, in which we check whether the user is logged in using ASP.NET Identity. Let's create an extension method that can be attached to the IdentityServer builder in the Startup.cs file.
Add a DBContext class to work with EF 6
*Also don’t forget to add custom classes for these table declarations. They are used for EF 6 migrations and ORM mappings, LINQ and quering DB
namespace APIA.EF
{
    public class AppDbContext : IdentityDbContext
    {
        public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }

        // Required database tables can come below as DbSet<T>
    }
}
Controller
Using the HttpContext you will get the logged in user’s details
namespace APIA.Controllers
{
    public class HomeController : Controller
    {
        private readonly UserManager<IdentityUser> _userManager;

        public HomeController(UserManager<IdentityUser> userManager)
        {
            _userManager = userManager;
        }

        public IActionResult Index()
        {
            return View();
        }

        [Authorize]
        public async Task<IActionResult> OpenBox()
        {
            // We will get all information of the logged-in user here, including claims
            var userInfo = await _userManager.GetUserAsync(HttpContext.User);
            return Ok("Yeahhh");
        }
    }
}
From this implementation (_userManager.GetUserAsync(HttpContext.User)), you will get information about the logged-in client if it uses Resource Owner Password validation as the grant type.
CONFIGURE API-B
Let's configure API-B, which will be called from API-A. Most of the configuration is the same. Let's create another WebAPI project, which we will call API-B.
The NeuVector Full Lifecycle Container Security Platform delivers the only cloud-native security with uncompromising end-to-end protection, from DevOps vulnerability protection to automated run-time security, featuring a true Layer 7 container firewall.
This script is for demo purposes only. It deploys a bare-minimum, single-node K3s Kubernetes cluster, Longhorn storage, and the NeuVector beta, and provides links to the interfaces and login information.
Prerequisites
Ubuntu 20.04+ Server
Minimum Recommended 4vCPU and 8GB of RAM (Try Hetzner or DigitalOcean)
This is a web application built with Flask that allows users to upload image files, extract the text from these files, generate a word cloud from the extracted text, and then serve the word cloud image for download.
Features
Text extraction from different file types:
Images (.png): text is extracted with the pytesseract library, an OCR (Optical Character Recognition) tool.
Word cloud generation: the wordcloud library generates a word cloud from the combined text extracted from all uploaded files. The word cloud is saved as a PNG image with a unique filename generated using the uuid library.
File download: The Flask send_from_directory function serves the generated word cloud image for download when the user navigates to the appropriate download URL.
How It Works
When a user navigates to the root URL, the index() function is called to handle the request. If the request method is POST, which means the user has submitted the form with files, the application checks if the user consented to process the files.
If consent is given, the application gets a list of uploaded images from the form data and processes each image. The text extracted from each file is added to a combined text string.
After all files have been processed, the application generates a word cloud from the combined text and saves it as a PNG image in a static directory. The user is then redirected to a download URL to download the word cloud image.
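The core of that flow can be sketched with two small helpers (illustrative names, not the app's actual functions; the OCR and word-cloud rendering themselves are assumed to come from pytesseract and wordcloud):

```python
import uuid
from pathlib import Path

def combine_texts(extracted_texts):
    """Join the text extracted from each uploaded file into one string."""
    return " ".join(t.strip() for t in extracted_texts if t and t.strip())

def wordcloud_output_path(static_dir="static"):
    """Build a unique PNG path for the generated word cloud,
    as the app does with the uuid library."""
    return str(Path(static_dir) / f"{uuid.uuid4().hex}.png")
```

The view function would pass the combined string to the word-cloud generator, save the image at the unique path, and redirect the user to the corresponding download URL.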
Running the Application
To run the application, use the command python app.py (or python3 app.py) in the terminal, where app.py is the file containing this application. This starts the Flask web server, and the application will be accessible in a web browser at http://localhost:5000 (or http://127.0.0.1:5000) unless otherwise specified.
Remember to install the necessary libraries (Flask, pytesseract, wordcloud) with pip (or pip3) before running the application.
In this repo you'll find a VPC created with Terraform. This repo was made to exercise some AWS services.
Technologies
This project is working with Terraform
Note
The whole project was created on Linux (Ubuntu), so make sure to run it on a similar OS. Also, you can modify the path given in main.tf according to your OS to avoid errors.
Prerequisites
You must have Terraform installed on the computer where you're running the project; make sure to follow the documentation below:
Step 3: Verify that the configuration is valid
terraform validate
Step 4: Preview the changes to the infrastructure
terraform plan
Warning
If you get an error running this command, make sure you have the config and credentials files in the .aws directory; if not, follow the documentation for the aws configure command.
Step 5: Create or update the infrastructure
terraform apply
Step 6: Destroy the infrastructure
terraform destroy
Note
Make sure you destroy the infrastructure in the AWS account where you created the credentials; otherwise you'll keep being charged while the services are active.
Contributing
Contributions are always welcome! Don't be afraid to fork the repo and make a pull request.
Also, you can file an issue for any problem found in the project so it can be fixed ASAP.