#StackBounty: #magento-2.1 #product #database #attributes Where does Magento store the PRODUCT_HAS_WEIGHT attribute

Bounty: 50

For some of our configured products, the PRODUCT_HAS_WEIGHT attribute has unfortunately not been set to 1. Although the products all have a weight set (as a float), Magento still refuses to calculate shipping for them.

I want to modify the PRODUCT_HAS_WEIGHT property for all products in the database. To do that, I want to access the MySQL database directly and issue a query that updates the corresponding fields. Unfortunately, I have been unable to identify which attribute to update.

For example, to examine the weights, I would do the following:

 (1) USE database;
 (2) SELECT * FROM eav_attribute WHERE attribute_code LIKE '%weight%';
 (3) SELECT * FROM catalog_product_entity_decimal WHERE attribute_id = 82;

However, a search like the one on line (2) does not turn up any product-has-weight attribute.

I’d kindly ask the community to help me find the related attribute to update for each product.

I’m using Magento CE 2.1.6.


Get this bounty!!!

#StackBounty: #php #mysql #database #rest #codeigniter Create API endpoint for fetching dynamic data based on time

Bounty: 50

I have a scraper that periodically scrapes articles from news sites and stores them in a MySQL database. The scraping works oldest-first: the oldest articles are scraped before more recent ones.

For example, an article written on the 1st of Jan would be scraped first and given ID 1, and an article written on the 2nd of Jan would get ID 2.

So recent articles would have higher IDs than older articles.

There are multiple scrapers running at the same time.

Now I need an endpoint that I can query based on the timestamp of the articles, with a limit of 10 articles per fetch.

The problem arises when, for example, 20 articles were posted with the timestamp 1499241705. When I query the endpoint with that timestamp and the check is >= 1499241705, I always get the same 10 articles each time; changing the condition to > means I skip articles 11-20. Adding another WHERE clause on the id does not work either, because articles are not always inserted in date order, since the scrapers run concurrently.

Is there a way I can query this endpoint so that I always get consistent data from it, with the latest articles coming first and then the older ones?

EDIT:

   +-----------------------+
   |   id | unix_timestamp |
   +-----------------------+
   |    1 |   1000         |
   |    2 |   1001         |
   |    3 |   1002         |
   |    4 |   1003         |
   |   11 |   1000         |
   |   12 |   1001         |
   |   13 |   1002         |
   |   14 |   1003         |
   +-----------------------+

The last timestamp and ID are sent through the WHERE clause.

E.g.
$this->db->where('unix_timestamp <=', $timestamp);
$this->db->where('id <', $offset);
$this->db->order_by('unix_timestamp', 'DESC');
$this->db->order_by('id', 'DESC');

On querying with a timestamp of 1003, ids 14 and 4 are fetched. But during the next call, id 4 would be the offset, so id 13 is never fetched and only id 3 comes back the next time around. So data would be missing.
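The tie problem above is usually avoided with keyset (seek) pagination: order by the composite (unix_timestamp, id) and compare the pair lexicographically, instead of filtering the timestamp and id in two independent clauses. A minimal sketch using sqlite3 for illustration, seeded with the sample table above (the same WHERE shape works in MySQL; in CodeIgniter the OR condition would need explicit grouping):

```python
import sqlite3

# Keyset pagination over the composite key (unix_timestamp, id):
# ties on the timestamp are broken by the id, so no row is skipped or repeated.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, unix_timestamp INTEGER)")
conn.executemany("INSERT INTO articles VALUES (?, ?)",
                 [(1, 1000), (2, 1001), (3, 1002), (4, 1003),
                  (11, 1000), (12, 1001), (13, 1002), (14, 1003)])

PAGE_SIZE = 2  # 10 in the real endpoint

def fetch_page(last_ts=None, last_id=None):
    """Fetch the next page strictly after the (last_ts, last_id) cursor."""
    if last_ts is None:  # first page: no cursor yet
        sql = """SELECT id, unix_timestamp FROM articles
                 ORDER BY unix_timestamp DESC, id DESC LIMIT ?"""
        return conn.execute(sql, (PAGE_SIZE,)).fetchall()
    sql = """SELECT id, unix_timestamp FROM articles
             WHERE unix_timestamp < ?
                OR (unix_timestamp = ? AND id < ?)
             ORDER BY unix_timestamp DESC, id DESC LIMIT ?"""
    return conn.execute(sql, (last_ts, last_ts, last_id, PAGE_SIZE)).fetchall()

page1 = fetch_page()                  # newest first: [(14, 1003), (4, 1003)]
last_id, last_ts = page1[-1]          # cursor = last row of the page
page2 = fetch_page(last_ts, last_id)  # [(13, 1002), (3, 1002)]
```

The page stays consistent even when many rows share a timestamp, because the (timestamp, id) cursor is unique per row, and it never relies on ids being in timestamp order.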



#StackBounty: #database #images #files #managed-files Is it possible to convert separate (matching) managed files to a single shared fi…

Bounty: 50

I’m finishing up a Commerce store that has tens of thousands of imported products. In a slightly simplified form, here’s my problem: each product has a unique primary image and a common secondary image. The import has created a separate copy of the exact same secondary image file for each product, which obviously creates enormous redundancy and uses much more disk space than necessary.

Is there a safe way to convert all of the secondary images into a single, shared image in the database and filesystem? (I’m envisioning some kind of SQL against the database to make multiple products reference the same image, and then hoping that Drupal will clean up the orphaned files automatically; am I looking in the right direction?)
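Roughly, the kind of repointing SQL I have in mind, sketched here with Python’s sqlite3 as a stand-in for the real database. file_managed is the real Drupal 7 file table, but the field table name field_data_field_secondary and its fid column are hypothetical placeholders for whatever the actual image field is called:

```python
import sqlite3

# Sketch: repoint every product's secondary-image reference at one canonical
# file. Duplicates are assumed already identified (in practice, match on a
# content hash of each file, not just metadata).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE file_managed (fid INTEGER PRIMARY KEY, uri TEXT, filesize INTEGER);
CREATE TABLE field_data_field_secondary (entity_id INTEGER, field_secondary_fid INTEGER);
""")
# Three products, each with its own copy of the same secondary image.
conn.executemany("INSERT INTO file_managed VALUES (?, ?, ?)",
                 [(10, "public://sec_0.jpg", 4096),
                  (11, "public://sec_1.jpg", 4096),
                  (12, "public://sec_2.jpg", 4096)])
conn.executemany("INSERT INTO field_data_field_secondary VALUES (?, ?)",
                 [(1, 10), (2, 11), (3, 12)])

canonical_fid = 10  # keep the first copy, repoint everything else at it
conn.execute("""UPDATE field_data_field_secondary
                SET field_secondary_fid = ?
                WHERE field_secondary_fid != ?""", (canonical_fid, canonical_fid))

# These files are now unreferenced by the field table.
orphans = [fid for (fid,) in conn.execute(
    "SELECT fid FROM file_managed WHERE fid != ?", (canonical_fid,))]
```

I assume the file_usage table would also need updating to match, and that the now-orphaned permanent files would still need explicit cleanup, since I’m not sure Drupal garbage-collects them on its own.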

(Drupal 7)



#StackBounty: #magento2 #database #attributes #event-observer #media-images how to save image custom attribute in magento 2

Bounty: 200

[screenshot: preview in backend]

[screenshot: preview in backend 2]

I need to display a few images of a product on the frontend based on a condition: “use for virtual mirror” should be checked.

<?php
/**
 * Copyright © Magento, Inc. All rights reserved.
 * See COPYING.txt for license details.
 */

namespace Dcw\Vm\Observer;

use Magento\Framework\Event\ObserverInterface;

class ChangeTemplateObserver extends \Magento\ProductVideo\Observer\ChangeTemplateObserver
{
    /**
     * @param mixed $observer
     * @SuppressWarnings(PHPMD.UnusedFormalParameter)
     * @return void
     */
    public function execute(\Magento\Framework\Event\Observer $observer)
    {
        $observer->getBlock()->setTemplate('Dcw_Vm::helper/gallery.phtml');
    }
}

Template:

<input type="checkbox" value="1" class="admin__control-checkbox" name="[][vm]" checked="checked" />
</div> </div>

Install script:

<?php

namespace Dcw\Vm\Setup;

use Magento\Framework\Setup\InstallSchemaInterface;
use Magento\Framework\Setup\SchemaSetupInterface;
use Magento\Framework\Setup\ModuleContextInterface;
use Magento\Catalog\Model\ResourceModel\Product\Gallery;

class InstallSchema implements InstallSchemaInterface
{
    public function install(SchemaSetupInterface $setup, ModuleContextInterface $context)
    {
        $setup->startSetup();

        $setup->getConnection()->addColumn(
            $setup->getTable(Gallery::GALLERY_TABLE),
            'vm',
            [
                'type' => \Magento\Framework\DB\Ddl\Table::TYPE_SMALLINT,
                'unsigned' => true,
                'nullable' => false,
                'default' => 0,
                'comment' => 'use for Vm'
            ]
        );

        $setup->endSetup();
    }
}

How do I save the checked state of the images in the backend? And how do I filter those images on the frontend? Can you help me with this?

UPDATE:

The following observer (on the event catalog_product_save_after) works for existing images, but not for new images.

<?php

namespace Dcw\Vm\Observer;

use Magento\Framework\Event\ObserverInterface;

class Productsaveafter implements ObserverInterface
{
    protected $request;
    protected $resource;

    /**
     * @param \Magento\Framework\App\RequestInterface $request
     * @param \Magento\Framework\App\ResourceConnection $resource
     */
    public function __construct(
        \Magento\Framework\App\RequestInterface $request,
        \Magento\Framework\App\ResourceConnection $resource
    ) {
        $this->request = $request;
        $this->resource = $resource;
    }

    public function execute(\Magento\Framework\Event\Observer $observer)
    {
        $vm = [];
        $data = $this->request->getPostValue();

        if (isset($data['product']['media_gallery']['images'])) {
            $images = $data['product']['media_gallery']['images'];

            foreach ($images as $image) {
                if (isset($image['vm']) && $image['vm'] == 1) {
                    $vm[$image['value_id']] = 1;
                } else {
                    $vm[$image['value_id']] = 0;
                }
            }

            $connection = $this->resource->getConnection();
            // getTableName() resolves the table prefix, if one is configured
            $tableName = $this->resource->getTableName('catalog_product_entity_media_gallery');
            $product = $observer->getProduct();
            $mediaGallery = $product->getMediaGallery();

            if (isset($mediaGallery['images'])) {
                foreach ($mediaGallery['images'] as $image) {
                    if (isset($vm[$image['value_id']])) {
                        // parameterized update instead of string-concatenated SQL
                        $connection->update(
                            $tableName,
                            ['vm' => $vm[$image['value_id']]],
                            ['value_id = ?' => $image['value_id']]
                        );
                    }
                }
            }
        }
    }
}



#StackBounty: #swift #database How to efficiently check database object based on location/proximity to user's location?

Bounty: 50

I am constructing an app (in Xcode) which, in a general sense, displays information to users. The information is stored as individual objects in a database (a Parse server hosted on Heroku). The user can elect to “see” information that was created within a set distance from their current location. (When information is saved to the DB, it is saved along with the lat and long of the user who initiated the save.) I know I can filter the pieces of information by comparing their lat and long to the viewing user’s current lat and long, and only display those which are close enough. Roughly/generally:

var currentUserLat = latitude // latitude of user's current location
var infoSet = [Object]()      // set of all info pulled from the DB
for info in infoSet {
    if abs(info.lat - currentUserLat) < 3 { // arbitrary value
        // display the info
    } else {
        // don't display
    }
}

This is set up decently enough, and it works fine. The reason it works fine, though, is the small number of entries in the DB at this time (the app is in development). Under practical usage (i.e. many users) the DB may be full of information objects (let’s say a thousand). In my opinion, individually pulling each DB entry and comparing its latitude to the current user’s latitude would take too long.

I know there must be a way to do it in a timely manner (think Tinder: they only display profiles of people in the near vicinity, and it doesn’t take long to do so despite millions of profiles), but I do not know what is most efficient.

I thought of creating separate sections in the DB for different geographical regions and then only searching the particular section that matches the user’s current location, but this seems unsophisticated and would still pull large amounts of info. What is the best way to do this?
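The usual pattern for this, and roughly what built-in geo queries do for you, is a two-step filter: a cheap, indexable bounding-box query narrows candidates server-side, then an exact great-circle distance check refines them. A sketch in Python for clarity (the same logic ports to Swift or to a server-side query; the arbitrary 3-degree threshold above is replaced by a radius in km):

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Exact great-circle distance between two points, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def bounding_box(lat, lon, radius_km):
    """Lat/lon ranges that are cheap to index; slightly over-inclusive."""
    dlat = math.degrees(radius_km / EARTH_RADIUS_KM)
    dlon = math.degrees(radius_km / (EARTH_RADIUS_KM * math.cos(math.radians(lat))))
    return (lat - dlat, lat + dlat, lon - dlon, lon + dlon)

def nearby(items, lat, lon, radius_km):
    lat_min, lat_max, lon_min, lon_max = bounding_box(lat, lon, radius_km)
    # In production this range check is the indexed DB query; here it is in-memory.
    candidates = [it for it in items
                  if lat_min <= it["lat"] <= lat_max and lon_min <= it["lng"] <= lon_max]
    # Exact refinement on the (small) candidate set only.
    return [it for it in candidates
            if haversine_km(lat, lon, it["lat"], it["lng"]) <= radius_km]
```

Since the backend is a Parse server, it may be simpler still to store the location as a GeoPoint column and let the server do this filtering with its built-in near-point queries, which avoids pulling the whole table to the client.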



#StackBounty: #database #cloud-service #synchronization #postgresql Sync local postgres db to a cloud postgres db, as a Windows Service

Bounty: 100

I’m looking for software that will sync a local Postgres DB to a DB in the cloud, like what Tableau can do with this service.

The software has to be installed as a Windows service, on any Windows machine.

The data will be pulled from the local DB and refreshed into a Postgres DB hosted in the cloud.

The direction is always local to remote, and the refresh period is within a frame of a few minutes.

The sync service and the cloud host can be from the same provider; in fact, we are looking for a solution that is part of the same ecosystem.

Thanks



#StackBounty: #windows #linux #database #mysql Automated mysql table duplication between servers, via SQL

Bounty: 50

The problem: I need to back up a few servers’ databases (sometimes a single table, other times all tables in the database) and, in one case, replicate a database on a nightly basis.

So this is database-to-database copying. You may say “use MySQL replication”, but I’m trying to achieve this without replication, as I don’t always have access to configure it on remotely hosted servers.

To clarify, these are servers at different locations, not in the same location. I’m aware of many solutions for when the tables exist in databases on the same server, but the options seem much more limited when that isn’t the scenario.

I need this to run on a schedule I can set up, say once a day, in a totally automated fashion, copying and overwriting the target tables with the data from the source.

Through the research I’ve done, I’m aware of a commercial Windows product that can do this (SQLyog), but I wanted to see if anyone knows of alternatives, ideally for Linux as well as Windows.



#StackBounty: #7 #theming #database #performance Is putting content in the codebase or the database better for performance?

Bounty: 50

I recently created a website footer using a block template. Some of the footer’s content was entered through the block’s WYSIWYG editor (stored in the database), while other code-heavy elements were placed in the block template file (stored in code). I’ve noticed that the footer block sometimes loads slowly. At other times, the content inserted through the WYSIWYG loads a few seconds before the content that was placed in the block template.

I’m curious: is storing HTML content in the database better for performance than storing it in template files?

EDIT: After I posted this, I also realized that the footer is being rendered with Blocks, whereas the rest of the page is being rendered with Panels. I’m not sure whether that has any effect on its performance.

