rhymepurple

joined 2 years ago
[–] [email protected] 1 points 8 hours ago

That's true, but how often have you heard a finance team member ask for a CSV file so they can more easily process the data using Pandas or visualize it with Matplotlib? How many accountants or finance people (especially those who ask for everything in Excel) do you know who are comfortable writing even a single line of Python code? How many of the finance team's Excel-based tools will Python integrate well with? What feature(s) does Python within Excel provide that Excel (formulas, pivot tables, VBA, Power Query, Power Pivot, etc.) does not provide that someone on the finance team would need? What advanced charting/dashboarding functionality does Python in Excel provide that isn't better accomplished in Power BI (if not handled by standard Excel charts/graphs)?
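To be clear about what that hypothetical finance person would supposedly be doing instead, it's roughly something like this (the file and column names are made up):

```python
# Minimal sketch of the pandas/Matplotlib workflow alluded to above.
# "transactions.csv" and its column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("transactions.csv", parse_dates=["date"])
monthly = df.groupby(df["date"].dt.to_period("M"))["amount"].sum()

monthly.plot(kind="bar", title="Monthly spend")
plt.tight_layout()
plt.show()
```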

Don't get me wrong - Microsoft's implementation of Python in Excel has its merits and will solve some problems that otherwise would not be possible in Excel and will make some people happy. However, this is not the solution most people were expecting, asking for, or find useful.

[–] [email protected] 9 points 2 days ago

I agree with everything you said, but (in Microsoft's eyes) this is a feature - not a bug.

Without this cloud component, how could:

  • Microsoft make sure that the accounting team does not introduce a malicious/old Python library into the Excel file?
  • Microsoft protect its users from writing/running inefficient, buggy, or malicious Python code?
  • Microsoft provide a Python runtime to users who do not know how to install Python?
  • Microsoft charge to run code that you wrote in a free, open source software programming language on a device that you own?
[–] [email protected] 18 points 2 days ago (10 children)

Over a year later and I still do not understand what the use case for this is.

Most of the examples/documentation that Microsoft made for this focus on data analysis and data visualization. Anyone in those fields would probably prefer to get the data out of Excel and into their tool/pipeline of choice instead of running their Python code in Excel. That also makes the big assumption that the data being used is fully contained within the Excel file and that the libraries used within the code are available in Excel (including the required library versions).
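For example, getting the data out of the workbook and into a normal Python environment is about this much work (file and sheet names are made up, and pandas needs openpyxl or another engine installed to read .xlsx):

```python
# Minimal sketch: pull the data out of the workbook and process it outside
# of Excel. The file and sheet names are hypothetical.
import pandas as pd

df = pd.read_excel("report.xlsx", sheet_name="Sheet1")
print(df.describe())                # analyze with whatever tooling you prefer
df.to_csv("report.csv", index=False)  # hand the data off to the rest of the pipeline
```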

For anyone looking to learn/use Excel better, I doubt the best use of their time is learning a new programming language and how Excel implements that programming language. They would likely be better off learning Excel's formulas, pivot tables, charts, etc. They could even learn Power Query to take things to another level.

For anyone looking to learn Python, this is an absolutely terrible way to do so. For example, it abstracts away library maintenance, can surface modified error messages, and makes the developer feedback loop more complicated.

If you want to automate Excel, this realistically adds very little functionality that did not exist prior to this feature. Other Python libraries like OpenPyxl and xlWings will still be required to automate Excel.
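As a rough illustration, automating a workbook from outside of Excel already looks something like this with openpyxl (the file name and cell contents are made up):

```python
# Minimal sketch of automating a workbook from outside Excel with openpyxl.
# The filename and cell contents are hypothetical.
from openpyxl import load_workbook

wb = load_workbook("report.xlsx")
ws = wb.active
ws["B2"] = 42                 # write a value
ws["B3"] = "=SUM(B1:B2)"      # write a formula that Excel will evaluate
wb.save("report_updated.xlsx")
```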

I am sure there are edge cases where this iteration of Python in Excel is perfect. However, this feels more like a checkbox filler ("yeah, Excel supports Python now") than an implementation of an actually useful feature. A fully featured and supported Python library that manipulates Excel/Excel files would have been a much more exciting and useful feature - even if it had to be executed outside of Excel, like OpenPyxl.

[–] [email protected] 4 points 3 weeks ago (1 children)

Take a look at QuickWeather if you want a map.

[–] [email protected] 12 points 3 weeks ago

This is definitely the wrong answer for this community, but may be an acceptable answer for this post. I have never used it nor would I ever recommend using it, but the conversations I have had with others who do use it make it seem like the service is far better than any alternative. Given the OP's requirements and willingness to both pay and sacrifice privacy, it seems like this may be appropriate for OP.

I would still explore other options though. There are several competitors to Life360 and presumably there are some with better privacy policies (even if the service would not typically be recommended on this community). Maybe OP could use a service like https://tosdr.org or https://tldrlegal.com to better evaluate those options that would likely not get much attention on this community.

Depending on the required features, the Live Location Sharing feature of chat apps like Element may be sufficient. It could also improve the users' privacy if they switch to a more private/secure messaging app in the process.

[–] [email protected] 9 points 4 weeks ago (1 children)

The improvements sound great.

I did not look through the details, but it's strange that one of the features is using Cloudflare R2 to improve download speeds and reduce API calls to GitHub while, at the same time, a personal GitHub API token becomes a new requirement.

Hopefully one day the GitHub requirement will be removed. It would be nice if projects/code stored on GitLab, Codeberg, or other Git services like Gitea or Forgejo could be used without having to mirror/fork the project onto GitHub.
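Presumably the token is there to authenticate the GitHub API calls (e.g., for higher rate limits) - that's an assumption on my part, but the call itself would be nothing exotic, roughly:

```python
# Minimal sketch: query the latest release of a GitHub-hosted project,
# optionally authenticated to get a higher API rate limit.
# The repository name and token value are hypothetical.
import os
import requests

headers = {"Accept": "application/vnd.github+json"}
token = os.environ.get("GITHUB_TOKEN")  # personal access token, if provided
if token:
    headers["Authorization"] = f"Bearer {token}"

resp = requests.get(
    "https://api.github.com/repos/someowner/someproject/releases/latest",
    headers=headers,
    timeout=10,
)
resp.raise_for_status()
print(resp.json()["tag_name"])
```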

[–] [email protected] 2 points 1 month ago

In terms of privacy, you are giving your identity provider insight into each of the third-party services that you use. It may seem that there isn't much of a difference between using Google's SSO and using your Gmail address to register your third-party account. However, one big distinction is that Google would be able to see how often and when you use each of your third-party services.

Also, it may be impossible to restrict the sharing of certain information from your identity provider with the third party service. For example, maybe you don't want to share a picture of yourself with a service, but that service uses user profile pictures or avatars. That service may ask (and require) that you give it access to your Google account's profile picture in order to authenticate using Google's SSO. You may be able to overwrite that picture, but you also may not be able to revoke the service's ability to retrieve it. If you used a "regular" local account, that Google profile picture would never be shared with the third party service if you did not upload it directly. The same is true for other information like email, first/last/full name, birthday, etc.
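To make that concrete: Google's SSO is built on OpenID Connect, and when a service requests the standard "openid email profile" scopes, the ID token it receives carries claims roughly like the following (all values made up):

```python
# Rough sketch of the claims a third-party service can receive in an
# OpenID Connect ID token when it requests the "openid email profile"
# scopes. All values are hypothetical.
id_token_claims = {
    "sub": "110169484474386276334",  # stable user identifier
    "email": "jane@example.com",
    "name": "Jane Example",
    "given_name": "Jane",
    "family_name": "Example",
    "picture": "https://lh3.googleusercontent.com/a/example-photo",
    "iss": "https://accounts.google.com",
    "aud": "client-id-of-the-third-party-service",
}

# Once the service has received these claims there is no "unsharing" them
# after the fact; you can only cut off its future access.
print(id_token_claims["picture"])
```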

There are other security and operational concerns with using SSO options. With the variety of password managers available, introduction of passkeys, and increased adoption of multi-factor authentication, many of the security benefits associated with SSO aren't as prevalent as they were 10 years ago. The biggest benefit is likely the convenience that SSO still brings compared to other authentication methods.

Ultimately it's up to you to determine if these concerns are worth the benefits of using SSO (or of using the third-party service provider at all if it requires SSO). I have a feeling the common advice will be to avoid SSO unless it's an identity provider that you trust (or even better - one that you host yourself) - especially if you're using unique emails/usernames along with strong, unique passwords and multi-factor authentication and/or passkeys.

[–] [email protected] 8 points 1 month ago

There are a few performance issues that you may experience. For example, if you're into online gaming then your latency will likely increase. Your internet connection bandwidth could also be limited by either Mullvad's servers, your router, or any of the additional hops necessary due to the VPN. There's also the situation where you have no internet connection at all due to an issue with the VPN connection.

There are also some user experience issues that users on the network may experience. For example, any location-based services that rely on IP address will either not work at all or require manual updates by the user. The same is true for other settings like locale, though those are hopefully better handled via browser/system settings. What's more likely is running into content restrictions based on the geographic location of the IP address. Additionally, some accounts/activity could be flagged as suspicious, suspended, or blocked/deleted if you change servers too frequently.

I'm sure you are either aware of or thought through most of that, but you may want to make sure everyone on the network is fine with that too.

In terms of privacy and security, it really comes down to your threat model. For example, if you're logged into Facebook, Google, etc. 24/7, use Chrome, Windows, etc., and never change the outbound Mullvad server, you're not doing too much more than removing your ISP's ability to log your activity (and maybe that's all you want/need).

[–] [email protected] 14 points 1 month ago

Ultra-wideband

In addition to other use cases, it is used to precisely identify where a device is in relation to another one.

[–] [email protected] 2 points 1 month ago

I think there may be an issue where F-Droid is not properly recognizing the 64-bit version of Findroid. Maybe Droid-ify and/or the version of Android you are using won't allow 32-bit apps to be installed.

[–] [email protected] 11 points 1 month ago (1 children)

Just to clarify - this is just an update that (I believe) is only available on IzzyOnDroid's F-Droid Repo, which previously had prior Findroid versions available. This new v0.15.0 is not available on the main F-Droid Repo.

Is anyone only able to download the 32-bit version of this app via F-Droid? It looks like a 64-bit version has been made available starting with v0.3.0 and is also available on this new version.

[–] [email protected] 2 points 1 month ago (1 children)

Really not sure why you got downvoted so hard, and it's a shame your comment was deleted. Your comment was relevant, accurate, and focused on an issue that others here aren't talking about (and apparently don't want to). You were also the only person in this thread who provided any sources.

I'm not sure what argument can be made against what you said. Just because a piece of information "is public" doesn't mean everyone wants that public information collected and shared with little (if any) control/input by you. If that were the case, doxxing wouldn't be an issue.

 

cross-posted from: https://lemmy.ml/post/16693054

Is there a feature in a CI/CD pipeline that creates a snapshot or backup of a service's data prior to running a deployment? The steps of an ideal workflow that I am searching for are similar to the following (a rough sketch of the snapshot/rollback step follows the list):

  1. CI tool identifies new version of service and creates a pull request
  2. Manually merge pull request
  3. CD tool identifies changes to Git repo
    1. CD tool creates data snapshot and/or data backup
    2. CD tool deploys update
  4. Issue with deployment identified that requires rollback
    1. Git repo reverted to prior commit and/or Git repo manually modified to prior version of service
    2. CD tool identifies the rolled back version
      1. (OPTIONAL) CD tool creates data snapshot and/or data backup
      2. CD tool reverts to snapshot taken prior to upgrade
      3. CD tool deploys service to prior version per the Git repo
  5. (OPTIONAL) CD tool prunes data snapshot and/or data backup based on provided parameters (eg - delete snapshots after _ days, only keep 3 most recently deployed snapshots, only keep snapshots for major version releases, only keep one snapshot for each latest major, minor, and patch version, etc.)
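As far as I know, none of the common CD tools do all of this out of the box. As a rough sketch of what the snapshot-before-deploy/rollback portion (steps 3-4) could look like as a pre-deploy hook the pipeline calls (paths, the deploy command, and the retention count are all made up):

```python
# Rough sketch of a pre-deploy hook a CD pipeline could run: back up the
# service's data, deploy, and restore the backup if the deploy fails.
# Paths, the deploy command, and the retention count are hypothetical.
import shutil
import subprocess
import time
from pathlib import Path

DATA_DIR = Path("/srv/myservice/data")
SNAPSHOT_DIR = Path("/srv/myservice/snapshots")
KEEP_SNAPSHOTS = 3

def create_snapshot() -> Path:
    SNAPSHOT_DIR.mkdir(parents=True, exist_ok=True)
    target = SNAPSHOT_DIR / time.strftime("%Y%m%d-%H%M%S")
    shutil.copytree(DATA_DIR, target)
    return target

def prune_snapshots() -> None:
    # Timestamped names sort chronologically, so keep only the newest few.
    snapshots = sorted(SNAPSHOT_DIR.iterdir())
    for old in snapshots[:-KEEP_SNAPSHOTS]:
        shutil.rmtree(old)

def deploy() -> None:
    # Replace with the real deployment step (e.g. docker compose up -d).
    subprocess.run(["docker", "compose", "up", "-d"], check=True)

def restore(snapshot: Path) -> None:
    shutil.rmtree(DATA_DIR)
    shutil.copytree(snapshot, DATA_DIR)

if __name__ == "__main__":
    snapshot = create_snapshot()
    try:
        deploy()
    except subprocess.CalledProcessError:
        restore(snapshot)
        raise
    prune_snapshots()
```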
8
submitted 3 months ago* (last edited 3 months ago) by [email protected] to c/[email protected]
 

Is there a feature in a CI/CD pipeline that creates a snapshot or backup of a service's data prior to running a deployment? The steps of an ideal workflow that I am searching for are similar to:

  1. CI tool identifies new version of service and creates a pull request
  2. Manually merge pull request
  3. CD tool identifies changes to Git repo
    1. CD tool creates data snapshot and/or data backup
    2. CD tool deploys update
  4. Issue with deployment identified that requires rollback
    1. Git repo reverted to prior commit and/or Git repo manually modified to prior version of service
    2. CD tool identifies the rolled back version
      1. (OPTIONAL) CD tool creates data snapshot and/or data backup
      2. CD tool reverts to snapshot taken prior to upgrade
      3. CD tool deploys service to prior version per the Git repo
  5. (OPTIONAL) CD tool prunes data snapshot and/or data backup based on provided parameters (eg - delete snapshots after _ days, only keep 3 most recently deployed snapshots, only keep snapshots for major version releases, only keep one snapshot for each latest major, minor, and patch version, etc.)
3
submitted 1 year ago* (last edited 1 year ago) by [email protected] to c/[email protected]
 

I'm trying to find a video that demonstrated automated container image updates for Kubernetes, similar to Watchtower for Docker. I believe the video was by @[email protected] but I can't seem to find it. The closest functionality that I can find to what I recall from the video is k8s-digester. Some key features that were discussed include:

  • Automatically update tagged version number (eg - Image:v1.1.0 -> Image:v1.2.0)
  • Automatically update image based on tagged image's digest for tags like "latest" or "stable"
  • Track container updates through modified configuration files
    • Ability to manage deploying updates through Git workflows to prevent unwanted updates
  • Minimal (if any) downtime
  • This may not have been in the video, but I believe it also discussed managing backups and rollback functionality as part of the upgrade process

While this tool may be used in a CI/CD pipeline, it's not limited exclusively to Git repositories as it could be used to monitor container registries from various people or organizations. The tool/process may have also incorporated Ansible.
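For what it's worth, the core "bump the tagged version in a tracked config file" idea can be sketched in a few lines (the manifest path, image name, and new tag are all made up - real tools like Renovate or k8s-digester handle this far more robustly):

```python
# Rough sketch of updating an image tag in a tracked manifest, so the
# change goes through the usual Git review/merge workflow before rollout.
# The manifest path, image name, and new tag are hypothetical.
import re
from pathlib import Path

MANIFEST = Path("deployment.yaml")
IMAGE = "ghcr.io/example/image"
NEW_TAG = "v1.2.0"

text = MANIFEST.read_text()
updated = re.sub(
    rf"(image:\s*{re.escape(IMAGE)}):\S+",
    rf"\1:{NEW_TAG}",
    text,
)
MANIFEST.write_text(updated)
```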

If you don't know which video I'm referring to, do you have any suggestions on how to achieve this functionality?

EDIT: For anyone stumbling on this thread, the video was Meet Renovate - Your Update Automation Bot for Kubernetes and More! by @[email protected], which discusses the Kubernetes tool Renovate.
